Conversation

@tanmay17061

With this change, kaldi-serve no longer crashes during ChainModel creation when a model is not found at the specified path.

Old behaviour:

  1. If a model is not found, the KALDI_ERR macro is used to log the error.
  2. KALDI_ERR prints to STDERR and raises a std::runtime_error.
  3. The uncaught std::runtime_error causes the program to exit.

New behaviour:

  1. If a model is not found, the KALDI_ERR macro is used to log the error.
  2. KALDI_ERR prints to STDERR and raises a std::runtime_error.
  3. The std::runtime_error is caught, and no entry for the model is created in the DecoderQueue map.

@tanmay17061 (Author) commented May 23, 2023

Testing setup:
Config:

[[model]]
name = "english-v5-new-structure"
language_code = "en"
path = "/home/app/models/english-v5-new-structure"

[[model]]
name = "somemodel"
language_code = "en"
path = "/home/app/models/somemodel"

with /home/app/models/english-v5-new-structure present, but /home/app/models/somemodel missing.

Output:

:: Loading 2 models
::   - english-v5-new-structure (en)
::   - somemodel (en)
:: Loading model from /home/app/models/english-v5-new-structure
LOG ([5.5.0~1-da93]:RemoveOrphanNodes():nnet-nnet.cc:948) Removed 1 orphan nodes.
LOG ([5.5.0~1-da93]:RemoveOrphanComponents():nnet-nnet.cc:847) Removing 2 orphan components.
LOG ([5.5.0~1-da93]:Collapse():nnet-utils.cc:1472) Added 1 components, removed 2
WARNING ([5.5.0~1-da93]:ChainModel():model/model-chain.cpp:52) Word boundary file/home/app/models/english-v5-new-structure/word_boundary.int not found. Disabling word level features.
WARNING ([5.5.0~1-da93]:ChainModel():model/model-chain.cpp:77) RNNLM artefacts not found. Disabling RNNLM rescoring feature.
LOG ([5.5.0~1-da93]:ComputeDerivedVars():ivector-extractor.cc:183) Computing derived variables for iVector extractor
LOG ([5.5.0~1-da93]:ComputeDerivedVars():ivector-extractor.cc:204) Done.
LOG ([5.5.0~1-da93]:CompileLooped():nnet-compile-looped.cc:345) Spent 0.137849 seconds in looped compilation.
:: Loading model from /home/app/models/somemodel
ERROR ([5.5.0~1-da93]:Input():kaldi-io.cc:756) Error opening input stream /home/app/models/somemodel/HCLG.fst

[ Stack-Trace: ]
/opt/kaldi/src/lib/libkaldi-base.so(kaldi::MessageLogger::LogMessage() const+0xa71) [0x7fe85b26b50f]
/opt/kaldi/src/lib/libkaldi-decoder.so(kaldi::MessageLogger::LogAndThrow::operator=(kaldi::MessageLogger const&)+0x11) [0x7fe85ce63e13]
/opt/kaldi/src/lib/libkaldi-util.so(kaldi::Input::Input(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, bool*)+0x90) [0x7fe85b972b9c]
/opt/kaldi/src/lib/libkaldi-fstext.so(fst::ReadFstKaldiGeneric(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, bool)+0x3d) [0x7fe85c7734b5]
/usr/local/lib/libkaldiserve.so(kaldiserve::ChainModel::ChainModel(kaldiserve::ModelSpec const&)+0xb0f) [0x7fe85ea5710f]
/usr/local/lib/libkaldiserve.so(kaldiserve::DecoderFactory::DecoderFactory(kaldiserve::ModelSpec const&)+0x140) [0x7fe85ea53f30]
/usr/local/lib/libkaldiserve.so(kaldiserve::DecoderQueue::DecoderQueue(kaldiserve::ModelSpec const&)+0x237) [0x7fe85ea54b67]
./kaldi_serve_app(KaldiServeImpl::KaldiServeImpl(std::vector<kaldiserve::ModelSpec, std::allocator<kaldiserve::ModelSpec> > const&)+0xe1) [0x565374835c11]
./kaldi_serve_app(run_server(std::vector<kaldiserve::ModelSpec, std::allocator<kaldiserve::ModelSpec> > const&)+0x22) [0x56537483a4d2]
./kaldi_serve_app(main+0x661) [0x565374808851]
/lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf5) [0x7fe85e3b3b45]
./kaldi_serve_app(_start+0x2a) [0x5653748092ea]

ERROR ([5.5.0~1-da93]:ChainModel():model/model-chain.cpp:110) kaldi::KaldiFatalError

[ Stack-Trace: ]
/opt/kaldi/src/lib/libkaldi-base.so(kaldi::MessageLogger::LogMessage() const+0xa71) [0x7fe85b26b50f]
/usr/local/lib/libkaldiserve.so(+0xc7f4a) [0x7fe85ea1af4a]
/usr/local/lib/libkaldiserve.so(kaldiserve::ChainModel::ChainModel(kaldiserve::ModelSpec const&)+0x1e20) [0x7fe85ea58420]
/usr/local/lib/libkaldiserve.so(kaldiserve::DecoderFactory::DecoderFactory(kaldiserve::ModelSpec const&)+0x140) [0x7fe85ea53f30]
/usr/local/lib/libkaldiserve.so(kaldiserve::DecoderQueue::DecoderQueue(kaldiserve::ModelSpec const&)+0x237) [0x7fe85ea54b67]
./kaldi_serve_app(KaldiServeImpl::KaldiServeImpl(std::vector<kaldiserve::ModelSpec, std::allocator<kaldiserve::ModelSpec> > const&)+0xe1) [0x565374835c11]
./kaldi_serve_app(run_server(std::vector<kaldiserve::ModelSpec, std::allocator<kaldiserve::ModelSpec> > const&)+0x22) [0x56537483a4d2]
./kaldi_serve_app(main+0x661) [0x565374808851]
/lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf5) [0x7fe85e3b3b45]
./kaldi_serve_app(_start+0x2a) [0x5653748092ea]

Model not loaded: (somemodel,en). This raised an error:kaldi::KaldiFatalError
kaldi-serve gRPC Streaming Server listening on 0.0.0.0:5016

The ListModels RPC responds with:

models:{name:"english-v5-new-structure"  language_code:"en"}

that is, the missing model is not present in the RPC response.

Calling Recognize with the missing model results in the response:

error: <_InactiveRpcError of RPC that terminated with:
        status = StatusCode.UNKNOWN
        details = "[random] (KaldiServe Client) Recognize recognition config validation failed 
        ==>> ValidationError 
        ==>> (ERROR) model (somemodel) not supported by Kaldi. (recognition_config.model)"
        debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:5018 {grpc_message:"[random] (KaldiServe Client) Recognize recognition config validation failed \n\t==>> ValidationError \n\t==>> (ERROR) model (somemodel) not supported by Kaldi. (recognition_config.model)", grpc_status:2, created_time:"2023-05-23T15:12:14.014718+05:30"}"
>

@tanmay17061 tanmay17061 requested a review from pskrunner14 May 23, 2023 10:03