
The CTC model has been widely applied in many scenarios because of its
simple structure, excellent performance, and fast inference speed. The
probability distributions predicted by CTC models contain many peaks, each
representing a non-blank token. The recognition latency of CTC models can be
reduced by encouraging the model to predict peaks earlier. Existing methods
for reducing latency require modifying the transition relationships between
tokens in the forward-backward algorithm and the gradient calculation, and
some even depend on forced-alignment results provided by other pretrained
models. These methods are complex to implement. To reduce peak latency, we
propose a simple and novel method named peak-first regularization, which
uses a frame-wise knowledge distillation function to force the probability
distribution of the CTC model to shift left along the time axis, instead of
directly modifying the calculation of the CTC loss and its gradients. All
experiments are conducted on the Chinese Mandarin dataset AISHELL-1. We
verify the effectiveness of the proposed regularization on both streaming
and non-streaming CTC models. The results show that the proposed method
reduces the average peak latency by about 100 to 200 milliseconds with
almost no degradation in recognition accuracy.
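The abstract only names the idea, so here is a minimal sketch of one plausible form of the regularizer, assuming it is a frame-wise KL divergence that treats each frame's *next* frame as a fixed teacher distribution; the function name and the toy logits are illustrative, not from the paper:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the vocabulary axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def peak_first_regularizer(logits):
    """Frame-wise KD term: sum over t of KL(P[t+1] || P[t]).

    Pulling each frame's distribution toward the distribution of the
    following frame (treated as a fixed teacher) encourages probability
    peaks to shift left, i.e. earlier, along the time axis. In training
    this term would be added to the CTC loss with a small weight.
    """
    probs = softmax(logits)          # shape (T, V)
    student = probs[:-1]             # frames 0 .. T-2
    teacher = probs[1:]              # frames 1 .. T-1 (no gradient in practice)
    eps = 1e-12                      # avoid log(0)
    kl = teacher * (np.log(teacher + eps) - np.log(student + eps))
    return kl.sum()

# Toy check: identical distributions at every frame give zero penalty,
# while a single sharp peak yields a positive penalty.
flat = np.zeros((4, 3))
peaked = np.zeros((4, 3))
peaked[2, 1] = 5.0
print(peak_first_regularizer(flat), peak_first_regularizer(peaked))
```

Because the teacher is just the model's own output shifted by one frame, no external alignment model is needed, which matches the paper's stated motivation of avoiding forced alignments and changes to the CTC forward-backward computation.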

Latest Change: March 17, 2023, 7:35 a.m.