Robin AI, an artificial intelligence company focused on legal work, has raised 26 million dollars in a Series B funding round, a substantial sum that will allow it to further develop an AI designed to provide a particularly useful service to legal professionals.
The tool is built to work as an extension of Microsoft Word, something that sets it apart from many other AI products aimed at different services. The idea is to let lawyers consult laws and regulations without ever leaving the document they are working on.
An AI specialized in law
Integrated into Microsoft Word, Robin AI generates text grounded in reliable, real legal content, which is especially useful for avoiding the well-known problem of AI hallucinations. Built on Anthropic's Claude, the system offers a specialized service that could prove to be a real differentiator in the legal field.
Robin AI is a generative AI that, unlike general-purpose services such as ChatGPT, rests on a solid foundation in a specialized domain: the law. In 2023 there were already plenty of examples of why general-purpose generative AIs are unreliable when citing legislation: they may reference legal frameworks from other countries or continents, or even fabricate information entirely.
Challenges of AI in 2024
After a year in which artificial intelligence took center stage in the world of technology, 2024 shapes up as a key year in which the latest advances must be consolidated into genuinely valuable applications. Models like GPT-4 and Google Gemini have shown great potential, but they have yet to be exploited efficiently. Meanwhile, the progress of products like Robin AI shows how specialized approaches can serve particular niches.
At the same time, several problems accompany AI as it matures: its impact on the labor market, where it could drastically reduce the workforce in highly qualified sectors; the ease with which it can be used to create biased content, sometimes even crossing into criminal territory; and the environmental cost of AI itself, given the enormous amounts of energy required for training.