Actions: OpenNMT/OpenNMT-py

Showing runs from all workflows
194 workflow runs

use flash_attn_with_kvcache for faster inference (#2539)
Lint & Tests #1321: Commit 0436cdd pushed by vince62s
December 26, 2023 10:08 · 4m 57s · master

use flash_attn_with_kvcache for faster inference
Lint & Tests #1320: Pull request #2539 synchronize by vince62s
December 26, 2023 09:42 · 4m 53s · vince62s:newcache

use flash_attn_with_kvcache for faster inference
Lint & Tests #1319: Pull request #2539 synchronize by vince62s
December 26, 2023 09:37 · 2m 38s · vince62s:newcache

use flash_attn_with_kvcache for faster inference
Lint & Tests #1318: Pull request #2539 synchronize by vince62s
December 19, 2023 15:57 · 5m 1s · vince62s:newcache

use flash_attn_with_kvcache for faster inference
Lint & Tests #1317: Pull request #2539 synchronize by vince62s
December 19, 2023 15:53 · 3m 23s · vince62s:newcache

use flash_attn_with_kvcache for faster inference
Lint & Tests #1316: Pull request #2539 synchronize by vince62s
December 19, 2023 15:29 · 4m 56s · vince62s:newcache

use flash_attn_with_kvcache for faster inference
Lint & Tests #1315: Pull request #2539 synchronize by vince62s
December 19, 2023 15:10 · 4m 56s · vince62s:newcache

use flash_attn_with_kvcache for faster inference
Lint & Tests #1314: Pull request #2539 opened by vince62s
December 19, 2023 15:09 · 2m 43s · vince62s:newcache

MoE for mixtral 8x7b (#2535)
Lint & Tests #1313: Commit 05cde4d pushed by vince62s
December 19, 2023 14:48 · 4m 45s · master

MoE for mixtral 8x7b
Lint & Tests #1312: Pull request #2535 synchronize by vince62s
December 19, 2023 14:41 · 4m 49s · vince62s:mixtral

Fixmask (#2538)
Lint & Tests #1311: Commit 2509d93 pushed by vince62s
December 18, 2023 11:41 · 5m 34s · master

Fixmask
Lint & Tests #1310: Pull request #2538 synchronize by vince62s
December 18, 2023 11:28 · 4m 55s · vince62s:fixmask

Fixmask
Lint & Tests #1309: Pull request #2538 synchronize by vince62s
December 18, 2023 10:20 · 4m 56s · vince62s:fixmask

Fixmask
Lint & Tests #1308: Pull request #2538 opened by vince62s
December 18, 2023 09:27 · 4m 11s · vince62s:fixmask

Defaultflash (#2537)
Lint & Tests #1307: Commit 23f7916 pushed by vince62s
December 18, 2023 07:50 · 4m 52s · master

Defaultflash
Lint & Tests #1306: Pull request #2537 opened by vince62s
December 18, 2023 07:32 · 5m 22s · vince62s:defaultflash

Apply the attention mask in all decoding steps (LM inference) (#2532)
Lint & Tests #1305: Commit f01bea1 pushed by vince62s
December 15, 2023 11:08 · 4m 51s · master

Apply the attention mask in all decoding steps (LM inference)
Lint & Tests #1304: Pull request #2532 synchronize by l-k-11235
December 15, 2023 10:31 · 4m 48s · l-k-11235:masked_attn_test

Apply the attention mask in all decoding steps (LM inference)
Lint & Tests #1303: Pull request #2532 synchronize by l-k-11235
December 15, 2023 10:25 · 4m 51s · l-k-11235:masked_attn_test

Apply the attention mask in all decoding steps (LM inference)
Lint & Tests #1302: Pull request #2532 synchronize by l-k-11235
December 15, 2023 10:21 · 4m 43s · l-k-11235:masked_attn_test

Apply the attention mask in all decoding steps (LM inference)
Lint & Tests #1301: Pull request #2532 synchronize by l-k-11235
December 15, 2023 10:08 · 5m 36s · l-k-11235:masked_attn_test

Apply the attention mask in all decoding steps (LM inference)
Lint & Tests #1300: Pull request #2532 synchronize by l-k-11235
December 15, 2023 09:48 · 4m 48s · l-k-11235:masked_attn_test

Apply the attention mask in all decoding steps (LM inference)
Lint & Tests #1299: Pull request #2532 synchronize by l-k-11235
December 15, 2023 08:54 · 4m 54s · l-k-11235:masked_attn_test

Extend llamalike-converter (#2536)
Lint & Tests #1298: Commit cd49e76 pushed by vince62s
December 14, 2023 13:00 · 6m 27s · master

Apply the attention mask in all decoding steps (LM inference)
Lint & Tests #1297: Pull request #2532 synchronize by l-k-11235
December 14, 2023 11:07 · 4m 57s · l-k-11235:masked_attn_test