567 points by dataguy3000 6 months ago | 14 comments
user1 6 months ago
Interesting project! How do you handle named entities and coreferences with your NLP capabilities?
creator 6 months ago
We use state-of-the-art NER and coreference resolution algorithms in combination with custom rules to accurately identify and track entities throughout the conversation.
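(For anyone curious what "NER plus coreference plus custom rules" can look like in practice, here is a minimal sketch. It uses spaCy and a toy pronoun rule; the project's actual pipeline is not public, so the library choice and the rule are assumptions, not the creator's implementation.)

```python
# Minimal sketch of NER + rule-based coreference tracking over a conversation.
# Illustration only: uses spaCy (not necessarily what the project uses) and a
# toy rule that points bare pronouns at the most recently seen PERSON entity.
import spacy

nlp = spacy.load("en_core_web_sm")
PRONOUNS = {"he", "she", "they", "him", "her", "them"}

def track_entities(turns):
    last_person = None
    mentions = []
    for turn in turns:
        doc = nlp(turn)
        for ent in doc.ents:
            mentions.append((ent.text, ent.label_))
            if ent.label_ == "PERSON":
                last_person = ent.text
        # Custom rule: resolve pronouns to the last PERSON mentioned so far.
        for tok in doc:
            if tok.text.lower() in PRONOUNS and last_person:
                mentions.append((tok.text, f"COREF -> {last_person}"))
    return mentions

print(track_entities(["Alice emailed the Q3 report.", "She wants feedback by Friday."]))
```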
user2 6 months ago
How well does this integrate with other applications like Google Suite or Zoom?
creator 6 months ago
We have preliminary integrations for both services, and we're continuously adding support for more apps and tools.
user3 6 months ago
This seems similar to the recently announced BERT-based model by Google. When do you plan on open-sourcing it?
creator 6 months ago
We're currently re-evaluating our open-source strategy and haven't settled on a concrete release plan yet. Follow our GitHub page for updates.
user4 6 months ago
Are there any planned visual improvements for the chat interface? It might help to make interactions more intuitive.
creator 6 months ago
Yes, we have a brand-new UI revision in the works with clearer elements and more noticeable message context separations. Expect it in a couple of weeks!
user5 6 months ago
How is it different from other NLP-based assistants like Alexa or Google Assistant?
creator 6 months ago
Our focus on advanced NLP for the professional domain sets us apart: we aim to resolve complex requests intelligently within a more structured, templated framework than general-purpose assistants.
user6 6 months ago
What kind of models are behind the scenes for processing language from different domains (legal, medical, technical)?
creator 6 months ago
We use transformer models (e.g., BERT, RoBERTa) fine-tuned per domain, which lets us cater to specific use cases out of the box. We also provide API access for fine-tuning custom models to specific needs.
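(To make the per-domain routing concrete, here is a hypothetical sketch of dispatching requests to domain-specific fine-tuned checkpoints with the Hugging Face transformers library. The checkpoint names are placeholders; the project has not published its actual models or API.)

```python
# Hypothetical sketch of routing a request to a domain-specific fine-tuned model.
# The checkpoint names in DOMAIN_MODELS are placeholders, not published models;
# assumes the Hugging Face `transformers` library.
from transformers import pipeline

DOMAIN_MODELS = {
    "legal": "your-org/roberta-legal",        # placeholder checkpoint name
    "medical": "your-org/roberta-medical",    # placeholder checkpoint name
    "technical": "your-org/bert-technical",   # placeholder checkpoint name
}
_cache = {}

def classify(text, domain):
    # Lazily load the domain's fine-tuned model, then classify the request.
    if domain not in _cache:
        _cache[domain] = pipeline("text-classification", model=DOMAIN_MODELS[domain])
    return _cache[domain](text)

# Would route a contract question to the legal checkpoint once real models exist:
# classify("Can we terminate this vendor agreement early?", "legal")
```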
user7 6 months ago
How well does the agent handle contextual information when processing user inputs?
creator 6 months ago
We employ a conversational memory module that keeps track of recent interactions, and a dynamic context vector is integrated into the encoding layer of the NLP models. This enables context-aware processing and more accurate responses.
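(A rough sketch of what such a conversational memory could look like. This assumes an off-the-shelf sentence-transformers encoder and simple embedding averaging over a rolling window, not the project's actual context-vector integration.)

```python
# Rough sketch of a rolling conversational memory that produces a "context vector"
# from recent turns. Illustrative only: it assumes the sentence-transformers
# package rather than the project's own encoder, and just averages embeddings.
from collections import deque
import numpy as np
from sentence_transformers import SentenceTransformer

class ConversationMemory:
    def __init__(self, window=5):
        self.encoder = SentenceTransformer("all-MiniLM-L6-v2")  # 384-dim embeddings
        self.turns = deque(maxlen=window)  # keep only the most recent turns

    def add_turn(self, text):
        self.turns.append(self.encoder.encode(text))

    def context_vector(self):
        # Mean of recent turn embeddings; a real system would feed this into the
        # model's encoding layer alongside the new input rather than just averaging.
        if not self.turns:
            return np.zeros(384)
        return np.mean(np.stack(self.turns), axis=0)

memory = ConversationMemory()
memory.add_turn("Schedule a meeting with the design team on Tuesday.")
memory.add_turn("Actually, move it to Thursday afternoon.")
ctx = memory.context_vector()  # combined with the current utterance downstream
```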