Context awareness, codebase integrations, and model customization features provide teams with highly personalized code recommendations, explanations, documentation, and tests
Tabnine, creator of the industry’s first AI-powered coding assistant for developers, announced new product capabilities that enable organizations to receive more accurate and personalized recommendations based on their specific code and engineering patterns. Engineering teams can now increase Tabnine’s contextual awareness and quality of output by exposing it to their organization’s environment, both their local development environments and their entire codebase, to receive code completions, code explanations, and documentation tailored to them.
Engineering teams face mounting challenges amid ever-growing demand for new applications and features and continued constraints on budgets and hiring. AI coding assistants offer a possible solution by boosting developer productivity and efficiency, yet the full potential of generative AI in software development depends on further improving the relevance of its output for specific teams. The large language models (LLMs) behind each AI coding assistant have been trained on vast amounts of data and contain billions of parameters, making them excellent at providing useful answers on a wide variety of topics. By exposing generative AI to an organization’s specific code and distinctive patterns, however, Tabnine can tailor its output to each development team, dramatically improving the quality of its recommendations.
“Despite extensive training data, most AI coding assistants on the market today lack organization-specific context and domain knowledge, resulting in good but generic recommendations,” said Eran Yahav, co-founder and CTO of Tabnine. “Just as you need context to intelligently answer questions in real life, coding assistants also need context to intelligently answer questions. This is the driving force behind Tabnine’s new personalization capabilities, which use contextual awareness to augment LLMs with all the subtle nuances that make each developer and organization unique.”
Tabnine’s mission is to accelerate and simplify the software development lifecycle through AI, and these new product capabilities allow customers to give Tabnine context about their environment and receive better, personalized results. This further boosts developer productivity: it increases the quality of generated code and the acceptance rate of code completions, and it provides more informed answers to questions about code, syntax, and structure.
Tabnine achieves context awareness and personalization in three distinct ways:
- Context through local code awareness: Tabnine can access locally available data in each developer’s IDE to provide more accurate and relevant results.
- Connection to your software repository for global code awareness: Tabnine administrators can connect Tabnine to their organization’s code repositories, providing Tabnine access to a team’s entire codebase and significantly increasing the context that Tabnine uses to provide code recommendations, explain code, and generate tests and documentation.
- Customization of AI models: Building on the personalization achieved through context and connection, Tabnine continues to offer model customization to further enrich the capability and quality of its output. Enterprise engineering teams can benefit from a custom “Tabnine + You” model that fine-tunes Tabnine’s universal model with a customer’s own code.
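As a rough illustration of what fine-tuning a universal code model on a customer’s own code can look like, the sketch below uses the open-source Hugging Face stack. The base model, repository path, and hyperparameters are placeholder assumptions for illustration and do not represent Tabnine’s actual training pipeline.

```python
# Illustrative sketch: adapt a general-purpose code model to an organization's
# own source code. All names and settings are hypothetical placeholders.
from pathlib import Path

from datasets import Dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

BASE_MODEL = "Salesforce/codegen-350M-mono"  # stand-in for a "universal" code model
REPO_ROOT = Path("./company-repo")           # hypothetical path to the customer's codebase

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# Collect the organization's own source files as training text.
sources = [p.read_text(errors="ignore") for p in REPO_ROOT.rglob("*.py")]
dataset = Dataset.from_dict({"text": sources})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=1024)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="customized-code-model",
        num_train_epochs=1,
        per_device_train_batch_size=2,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()  # adapts the base model toward the organization's patterns
trainer.save_model("customized-code-model")
```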
Tabnine stays true to its values and delivers these highly personalized recommendations without compromising customer privacy: it continues to commit to advanced encryption and zero data retention for SaaS users, and it offers company codebase awareness within a completely private, customer-deployed version of Tabnine. The company uses retrieval-augmented generation (RAG) to gather knowledge of each organization’s unique environment. RAG is widely used in the industry; it not only reduces LLM hallucinations but also helps overcome the inherent limitations of training data. Tabnine commits never to retain or share organizations’ code or data, ensuring privacy at all times.
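For readers unfamiliar with RAG, the sketch below shows the basic retrieval step: rank snippets from an indexed codebase against a developer’s request and prepend the best matches to the prompt sent to the model. TF-IDF stands in for a production embedding model, and the snippets, query, and prompt format are simplified illustrations, not Tabnine’s implementation.

```python
# Minimal RAG-style retrieval sketch for a coding assistant.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Snippets previously indexed from the organization's connected repository
# (hypothetical examples).
snippets = [
    "def charge_card(customer_id, amount): ...",
    "class InvoiceRepository: ...",
    "def send_receipt_email(invoice): ...",
]

query = "write a function that refunds a customer payment"

# Rank indexed snippets by similarity to the query and keep the top matches.
vectorizer = TfidfVectorizer().fit(snippets + [query])
scores = cosine_similarity(
    vectorizer.transform([query]), vectorizer.transform(snippets)
)[0]
top_snippets = [snippets[i] for i in scores.argsort()[::-1][:2]]

# The retrieved organizational context grounds the model's answer.
prompt = (
    "Relevant code from this repository:\n"
    + "\n".join(top_snippets)
    + f"\n\nTask: {query}\n"
)
print(prompt)  # in practice, this prompt is sent to the LLM
```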
Tabnine Chat Now in GA
Tabnine also announced today that Tabnine Chat, the enterprise-grade, code-centric chat application that lets developers interact with Tabnine’s AI models using natural language, is now generally available to all users. Tabnine Chat extends Tabnine’s capabilities beyond code generation into nearly every aspect of the software development lifecycle, supporting learning and research, test generation, code maintenance and bug fixing, documentation generation, and more.
SOURCE: GlobeNewswire