Comments (17)
Are you using the gst-kaldi-nnet2-online plugin? If yes, did you update it to the latest trunk and recompile?
I have updated it, and I'm able to get the likelihood scores and confidence scores.
Can you copy/paste your configuration file?
Also, do you see any errors in the server or worker logs?
Here is my configuration file:

# You have to download TEDLIUM "online nnet2" models in order to use this sample.
# Run download-tedlium-nnet2.sh in 'test/models' to download them.
use-nnet2: True
decoder:
    use-threaded-decoder: true
    model: /home/azureuser/speech/server/kaldi-gstreamer-server-master/test/models/english/fisher_nnet_a_gpu_online/final.mdl
    word-syms: /home/azureuser/speech/server/kaldi-gstreamer-server-master/auxo/lm/words.txt
    fst: /home/azureuser/speech/server/kaldi-gstreamer-server-master/auxo/lm/HCLG.fst
    mfcc-config: /home/azureuser/speech/server/kaldi-gstreamer-server-master/test/models/english/fisher_nnet_a_gpu_online/conf/mfcc.conf
    ivector-extraction-config: /home/azureuser/speech/server/kaldi-gstreamer-server-master/test/models/english/fisher_nnet_a_gpu_online/conf/ivector_extractor.conf
    max-active: 7000
    beam: 15.0
    lattice-beam: 6.0
    acoustic-scale: 0.083
    lm-fst: /home/azureuser/speech/server/kaldi-gstreamer-server-master/auxo/lm/G.fst
    do-endpointing: True
    endpoint-silence-phones: "1:2:3:4:5:6:7:8:9:10"
    traceback-period-in-secs: 0.25
    chunk-length-in-secs: 0.25
    num-nbest: 10
out-dir: tmp

use-vad: False
silence-timeout: 10

# Just a sample post-processor that appends "." to the hypothesis
post-processor: perl -npe 'BEGIN {use IO::Handle; STDOUT->autoflush(1);} s/(.*)/\1./;'

full-post-processor: ./sample_full_post_processor.py

logging:
    version: 1
    disable_existing_loggers: False
    formatters:
        simpleFormater:
            format: '%(asctime)s - %(levelname)7s: %(name)10s: %(message)s'
            datefmt: '%Y-%m-%d %H:%M:%S'
    handlers:
        console:
            class: logging.StreamHandler
            formatter: simpleFormater
            level: DEBUG
    root:
        level: DEBUG
        handlers: [console]
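A note on the post-processor entry above: as the perl one-liner suggests, it is just an external command that reads hypothesis text line by line on stdin and writes the rewritten lines back unbuffered. A Python stand-in for that perl script, illustrative only and not part of the repository, could look like this:

```python
#!/usr/bin/env python
# Illustrative stand-in for the perl post-processor above: append "."
# to each hypothesis read from stdin and flush immediately (the
# equivalent of perl's STDOUT->autoflush(1)), so the worker is not
# stalled by output buffering.
import sys

# readline-based iteration avoids Python 2's stdin read-ahead buffering
for line in iter(sys.stdin.readline, ""):
    sys.stdout.write(line.rstrip("\n") + ".\n")
    sys.stdout.flush()
```

The explicit flush matters: without it, the worker would only see post-processed results once the output buffer fills, which looks like a hang on short utterances.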
Please use markdown syntax to make such things readable (https://guides.github.com/features/mastering-markdown/)
My second question was: do you see any errors in the server or worker logs?
This is my server log after a decoding is complete and when I send a new file to decode:
INFO 2015-07-01 14:11:47,999 Worker <__main__.WorkerSocketHandler object at 0x7f21aa3ac210> leaving
INFO 2015-07-01 14:11:48,902 ce903546-4d5d-4ae5-bea9-d6bee6151292: Handling on_connection_close()
INFO 2015-07-01 14:11:48,903 ce903546-4d5d-4ae5-bea9-d6bee6151292: Closing worker connection
INFO 2015-07-01 14:11:49,052 New worker available <__main__.WorkerSocketHandler object at 0x7f21aa3ac3d0>
INFO 2015-07-01 14:12:57,718 Status listener left
INFO 2015-07-02 02:54:18,959 0d516e2d-b038-4546-9c79-7c939c9937e7: OPEN: user='none', content='none'
INFO 2015-07-02 02:54:18,960 0d516e2d-b038-4546-9c79-7c939c9937e7: Using worker <__main__.HttpChunkedRecognizeHandler object at 0x7f21aa3acdd0>
INFO 2015-07-02 02:55:09,096 0d516e2d-b038-4546-9c79-7c939c9937e7: Handling the end of chunked recognize request
INFO 2015-07-02 02:55:09,098 0d516e2d-b038-4546-9c79-7c939c9937e7: yielding...
INFO 2015-07-02 02:55:09,099 0d516e2d-b038-4546-9c79-7c939c9937e7: Waiting for final result...
INFO 2015-07-02 03:09:58,502 Worker <__main__.WorkerSocketHandler object at 0x7f21aa38bbd0> leaving
INFO 2015-07-02 03:09:58,503 0d516e2d-b038-4546-9c79-7c939c9937e7: Receiving 'close' from worker
INFO 2015-07-02 03:09:58,504 0d516e2d-b038-4546-9c79-7c939c9937e7: Final hyp:
INFO 2015-07-02 03:09:58,507 200 PUT /client/dynamic/recognize (43.249.226.68) 939547.85ms
INFO 2015-07-02 03:09:58,507 Everything done
INFO 2015-07-02 04:01:09,529 New worker available <__main__.WorkerSocketHandler object at 0x7f21aa3acf50>
INFO 2015-07-02 04:01:28,258 New worker available <__main__.WorkerSocketHandler object at 0x7f21aa167990>
INFO 2015-07-02 05:48:23,666 57914580-bff0-45e3-bb37-f56cf23e5925: OPEN: user='none', content='none'
INFO 2015-07-02 05:48:23,667 57914580-bff0-45e3-bb37-f56cf23e5925: Using worker <__main__.HttpChunkedRecognizeHandler object at 0x7f21aa167e10>
OK, what about the worker log?
2015-07-02 04:01:28 - INFO: decoder2: Setting pipeline to READY
2015-07-02 04:01:28 - INFO: decoder2: Set pipeline to READY
2015-07-02 04:01:28 - INFO: __main__: Opening websocket connection to master server
2015-07-02 04:01:28 - INFO: __main__: Opened websocket connection to server
This is the log describing worker startup, not the processing of an actual request.
Here is the log after the decoder processed a couple of requests; it is not accepting connections from the server after this:
2015-07-02 06:04:25 - INFO: decoder2: 1ccfa65f-9f98-4d00-821a-3f2b8c7f8d99: Pipeline received eos signal
2015-07-02 06:04:25 - INFO: decoder2: 1ccfa65f-9f98-4d00-821a-3f2b8c7f8d99: Resetting decoder state
2015-07-02 06:04:25 - INFO: __main__: 1ccfa65f-9f98-4d00-821a-3f2b8c7f8d99: Sending adaptation state to client...
2015-07-02 06:04:25 - DEBUG: ws4py: Closing message received (1000) ''
2015-07-02 06:04:25 - DEBUG: __main__: 1ccfa65f-9f98-4d00-821a-3f2b8c7f8d99: Websocket closed() called
2015-07-02 06:04:25 - DEBUG: __main__: 1ccfa65f-9f98-4d00-821a-3f2b8c7f8d99: Websocket closed() finished
2015-07-02 06:04:26 - INFO: __main__: Opening websocket connection to master server
2015-07-02 06:04:26 - INFO: __main__: Opened websocket connection to server
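For readers of this thread: the last two lines are the normal reconnect cycle rather than a hang. After each request the worker's websocket to the master closes, and the worker opens a fresh one so the master can advertise it as available again. A rough sketch of such a reconnect loop, using ws4py as the worker does but simplified from the real worker.py:

```python
# Rough sketch of a worker-side reconnect loop (simplified; the real
# kaldi-gstreamer-server worker.py differs in detail). After a request
# finishes, the master closes the socket, run_forever() returns, and
# the worker reconnects so the master can list it as available again.
import time
from ws4py.client.threadedclient import WebSocketClient

class ServerConnection(WebSocketClient):
    def received_message(self, message):
        pass  # hand audio chunks / control messages to the decoder here

while True:
    conn = ServerConnection("ws://localhost:8888/worker/ws/speech")
    try:
        conn.connect()
        conn.run_forever()   # blocks until the socket is closed
    except Exception:
        pass                 # master unreachable; fall through and retry
    time.sleep(1)            # brief pause before reconnecting
```

So "Opening/Opened websocket connection" after a completed request is expected; the worker only has a problem if the master never logs "New worker available" afterwards.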
Here is my configuration file, uploaded at:
http://textuploader.com/3qep
I'm sorry, but I need a worker log from the startup until the 1st request has been processed.
OK, I will reproduce the issue and share the complete server and worker logs. Can I get your email ID?
Hi Alumae,
The problem seems mostly like #10, but now it occurs after a timeout instead of on the first decoding.
The timeout problem was caused by the Azure cloud, which closes inactive websocket connections.
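For others hitting this behind Azure (or any proxy with an idle-connection timeout), the generic workaround is to keep some traffic flowing on the socket. Below is a minimal sketch using Tornado's ping support; the handler name and interval are illustrative, and this is not necessarily how the repository itself addressed it:

```python
# Illustrative keepalive: send a websocket ping every 30 seconds so an
# idle-timeout proxy (e.g. Azure's load balancer) sees traffic and does
# not silently drop the connection. Names and values are assumptions.
import tornado.ioloop
import tornado.websocket

PING_INTERVAL_MS = 30 * 1000  # keep well under the proxy's idle timeout

class WorkerSocketHandler(tornado.websocket.WebSocketHandler):
    def open(self):
        self._keepalive = tornado.ioloop.PeriodicCallback(
            lambda: self.ping(b"keepalive"), PING_INTERVAL_MS)
        self._keepalive.start()

    def on_close(self):
        self._keepalive.stop()
```

Recent Tornado versions can achieve the same thing declaratively with the websocket_ping_interval application setting instead of a manual PeriodicCallback.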
Hi rohithkodali,
I am experiencing the same problem you did. Could you tell me how to solve it?
client:
{"status": 0, "hypotheses": [{"utterance": ""}], "id": "1a331ae8-cb64-48ce-ba87-d52476a9d5b4"}
master server:
INFO 2018-11-13 10:41:44,721 131493e0-9cef-4bbb-8c43-f147b618347d: Final hyp:
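To help isolate whether the problem is in the client or the worker, the HTTP endpoint from the log above can be exercised directly. A small reproduction sketch with the requests library; host, port, and test file path are assumptions:

```python
# Stream a test file to the server's HTTP API (the same
# PUT /client/dynamic/recognize seen in the server log) and print the
# JSON reply. status 0 with an empty "utterance" means decoding ran
# but produced no hypothesis. Host/port and file path are illustrative.
import requests

with open("test/data/english_test.wav", "rb") as f:
    resp = requests.put("http://localhost:8888/client/dynamic/recognize", data=f)
print(resp.json())
```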