Announcing DeepPavlov 0.12.0 release


Hello everyone and welcome! It’s a pleasure for us to share our latest release with you.

This is an important milestone for the DeepPavlov framework: in this version we’ve shipped support for PyTorch; Multi-task BERT, our latest contribution to the BERT community that we are very proud of; basic support for RASA.ai configs for building goal-oriented bots with our Go-Bot; and several other NLP components.

What’s included in the new DeepPavlov release?


  • DeepPavlov Library now supports both top Deep Learning frameworks – TensorFlow and PyTorch. PyTorch is available for several DeepPavlov components: a torch trainer, a torch base model class, and a torch text classifier, as well as a number of torch BERT-based models: classifier, ranking, sequence tagger, SQuAD, and summarizer. A minimal usage sketch appears after this list.


  • Support for Building Goal-Oriented Bots Using RASA Config Files. Our ML-driven Go-Bot framework enables developers to build their own chatbots. Prior to this release, building them involved creating a DSTC2-compatible dataset, either manually or with the community-contributed Automatic Dataset Generation Tool released in the previous version of the DeepPavlov library. While the DSTC2 schema is rich and quite powerful, creating a compatible dataset from scratch can be challenging. Given the immense popularity of RASA (another open-source Conversational AI framework) and the simplicity of its domain-specific languages (DSLs) for configs, we have added basic support for using these DSLs to define the behavior of goal-oriented bots built with the DeepPavlov Go-Bot framework. With these configs you can now define stories, intents, slots, and entities in RASA format to train your ML-based DeepPavlov Go-Bot; a sketch of such a project follows this list. We encourage you to read the tutorial notebook to get a better understanding of how to build basic and more advanced goal-oriented bots with these RASA DSLs.

  • Entity Linking is now available as a standalone model (in addition to powering our KBQA model released in the two previous versions). You can use this model to map entities found in a given text to entities in Wikidata; a usage sketch follows this list. Currently, only Russian-language Wikidata is supported, but the model can work with other languages and other knowledge graphs through custom implementations.

  • Top N Answers in the ODQA Model – another community contribution, this time made by mapryl. This improvement to our ODQA model’s API lets developers specify the number of results returned by the model; see the config sketch after this list. We are thankful to mapryl for her contribution to the DeepPavlov library!

  • New Hybrid NER Models Trained on OntoNotes for Russian and English languages.

  • Reader for the BoolQ Dataset. With this improvement, you can develop BoolQ configs for yes/no question answering in both Russian and English.

  • Other Improvements and Wrap Up. In addition to the aforementioned updates, this release also includes smaller improvements and fixes both in the product and in the documentation.
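
To give a feel for the new PyTorch support, here is a minimal sketch of loading a torch BERT-based classifier through the standard DeepPavlov API. The config name below is a placeholder, not necessarily the name of an actual shipped config; check the documentation for the exact torch configs included in 0.12.0.

```python
from deeppavlov import build_model

# Placeholder config name: 0.12.0 ships several torch BERT-based models,
# but the exact names may differ, so consult the docs before running this.
CONFIG = "insults_kaggle_bert_torch"  # hypothetical torch text classifier

# download=True fetches the config's pretrained weights on first use.
model = build_model(CONFIG, download=True)

# Classifiers take a batch of texts and return a batch of predicted labels.
print(model(["you are kind", "you are awful"]))
```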

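For the RASA DSL support, here is a rough sketch of what a minimal RASA-format project could look like, written out from Python for convenience. The file contents follow RASA’s Markdown/YAML conventions, but the exact files, layout, and Go-Bot config name are assumptions here; the tutorial notebook is the authoritative reference.

```python
from pathlib import Path

# Lay out a tiny RASA-style project: stories, NLU examples, and a domain.
project = Path("my_bot")
project.mkdir(exist_ok=True)

(project / "stories.md").write_text(
    "## greet and ask weather\n"
    "* greet\n"
    "  - utter_greet\n"
    "* ask_weather\n"
    "  - utter_weather\n"
)

(project / "nlu.md").write_text(
    "## intent:greet\n"
    "- hi\n"
    "- hello there\n"
    "\n"
    "## intent:ask_weather\n"
    "- what is the weather like\n"
)

(project / "domain.yml").write_text(
    "intents:\n"
    "  - greet\n"
    "  - ask_weather\n"
    "responses:\n"
    "  utter_greet:\n"
    "    - text: Hello!\n"
    "  utter_weather:\n"
    "    - text: It is sunny today.\n"
)

# A Go-Bot config would then point its data paths at this directory.
# The config name below is a placeholder; see the tutorial notebook for
# the one actually shipped with 0.12.0.
# from deeppavlov import train_model
# bot = train_model("gobot_rasa_minimal", download=True)
```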

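The standalone Entity Linking model can be loaded through the same build_model API as any other DeepPavlov model. Both the config name and the call signature below are assumptions for illustration only; the real model may, for example, expect pre-extracted entity mentions rather than raw text.

```python
from deeppavlov import build_model

# "entity_linking_rus" is a placeholder config name for the standalone
# Russian Wikidata entity-linking model; check the docs for the real one.
el_model = build_model("entity_linking_rus", download=True)

# Assumed usage: a batch of Russian texts in, Wikidata entity IDs out.
print(el_model(["Александр Пушкин родился в Москве."]))
```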

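Finally, here is a hedged sketch of asking the ODQA model for several answers instead of a single one. Whether the setting is literally called top_n, and which component of the config carries it, are assumptions; inspect the 0.12.0 ODQA config to find the actual parameter.

```python
from deeppavlov import build_model, configs
from deeppavlov.core.common.file import read_json

# Load the English ODQA config as a dict so it can be tweaked before building.
config = read_json(configs.odqa.en_odqa_infer_wiki)

# Assumed parameter name and location: look for a component that exposes
# "top_n" and raise it to get more than one answer back.
for component in config["chainer"]["pipe"]:
    if "top_n" in component:
        component["top_n"] = 5  # ask for the five best answers

odqa = build_model(config, download=True)
print(odqa(["Where did guinea pigs originate?"]))
```
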
We encourage you to begin building your Conversational AI systems with our DeepPavlov Library on GitHub and let us know what you think! Feel free to test our BERT-based models using our demo. And keep in mind that we have a dedicated forum where any questions concerning the framework and the models are welcome.