Coding is the most important application of GenAI right now, says Naveen Rao, head of AI at Databricks

“We are seeing a lot of developers, especially junior developers, really embrace this. LLMs (large language models) have also been trained on code (as well as text, images, audio, video, etc.), and we are already seeing design automation happening through coding assistants. This (coding) is a subclass of a much larger use case around true design automation,” Rao told Mint in a recent video interview from his San Diego office.

Rao argues that as many coding tasks become automated, the ability to innovate in product design and create unique user experiences will become even more valuable.

“You can now create apps by just describing them in English. As a result, the value of translating a design idea into an app has diminished because much of that process is now automated,” he says.

Becoming a developer will mean using AI tools effectively, Rao said, adding that the focus will shift to understanding why some applications succeed while others fail.

Rao emphasizes that “it will take 3-5 years to ensure that these systems are reliable and accurate enough to be used in core engineering.” This is because GenAI still struggles with “hallucinations” (false or misleading results) and lacks real reasoning capabilities, despite claims to the contrary by some major tech companies.

Data is eating the world through AI

“Current LLMs primarily do probabilistic pattern matching, not logic. Unlike humans or even animals, who learn by action and feedback loops, LLMs do not engage in causal, real-world learning. While advances such as the step-by-step reasoning introduced by OpenAI are promising, there is still a long way to go,” says Rao, who holds a bachelor’s degree in computer science from Duke University and a doctorate in computational neuroscience from Brown University.

He is also a serial entrepreneur, having founded artificial intelligence companies Nervana Systems, which Intel Corp. acquired for about $400 million in 2016, and MosaicML, which was acquired by Databricks for $1.3 billion last July.

What keeps Rao up at night is the gap between AI’s potential and its current limitations, particularly in reasoning. AI models like DeepMind’s AlphaZero, which hypothesizes and learns through self-play in structured environments, provide a glimpse of what’s possible, he says, “but we’re a long way from implementing that in the real world.”

“We need to better understand how we can improve the ability of models to reason and learn in dynamic environments,” Rao says.

But that reasoning effort, he says, doesn’t require greater computational power. Instead, there’s a shift toward smaller, higher-quality datasets and more targeted tuning that might not require the massive computational resources associated with training larger models.

The shift from mass offline training to online learning could change how we approach AI development going forward, Rao says.

Why software won’t eat the AI world

Rao also argues that software will not eat the AI world, a nod to “Why software is eating the world,” the famous 2011 essay by Marc Andreessen, co-founder of US VC firm Andreessen Horowitz. In a thread on X (formerly Twitter) on 30 August, Rao wrote: “…the fundamental balance of computation and software is different in AI… In light of @nvidia’s strong growth and recent earnings, it’s clear that hardware is a key component of the current wave of AI…”

Rao says he’s “more excited about the ongoing evolution of hardware, which continues to increase efficiency and lower costs.” He points to the human brain, which runs on just 20 watts of power, as an example. “It shows how far we are from creating efficient and advanced AI systems.” He adds that significant advances have been made in how AI systems interact with hardware, improving latency, cost, and accuracy.

Databricks, for example, is exploring “several exciting applications of AI and GenAI, particularly in task automation and creative processes,” according to Rao. One area of focus is automating tasks where “you can tolerate some error,” such as answering human resources (HR) queries or searching company manuals.

You can now create apps by just describing them in English

Another innovative use case involves training LLMs to mimic a specific writing style, helping news organizations speed up content creation. Databricks also supports the use of LLMs in scientific research, such as drug discovery, where AI helps analyze protein interactions and advance new drug development.

Databricks was recognized as a leader in the 2024 Gartner Magic Quadrant for data science and machine learning platforms, ‘The Forrester Wave: Cloud Data Pipelines, Q4 2023’ and the 2024 IDC MarketScape for worldwide analytics stream processing.

Key customers using Databricks’ data intelligence platform to streamline their data, AI and analytics processes include Adobe, Aditya Birla Fashion & Retail, Mercedes-Benz Tech Innovation, Nasdaq, Air India, Parle, MakeMyTrip, Meesho, Tata Steel and Shell.

Data is the oil of AI

For example, Databricks’ DBRX model can transform raw data into a fully trained, fine-tuned model. Databricks is now focusing on “composite AI systems,” which combine multiple AI models, including open-source and proprietary ones, to create advanced solutions, according to Rao. Using this approach, Databricks helped financial software provider FactSet improve query accuracy and efficiency.
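
A composite system of the kind Rao describes can be sketched as a router that dispatches each query to the best-suited component model. Everything below (the keyword router, the toy “expert” functions) is an illustrative assumption, not Databricks’ actual architecture; in practice the router might itself be a learned classifier or an LLM.

```python
from typing import Callable

# Stand-ins for specialized component models (hypothetical names).
def sql_expert(query: str) -> str:
    return f"[sql-model] handled: {query}"

def general_chat(query: str) -> str:
    return f"[chat-model] handled: {query}"

def route(query: str, experts: dict[str, Callable[[str], str]],
          fallback: Callable[[str], str]) -> str:
    # Naive keyword routing stands in for a learned router: send the
    # query to the first expert whose keyword appears, else fall back.
    for keyword, expert in experts.items():
        if keyword in query.lower():
            return expert(query)
    return fallback(query)

experts = {"select": sql_expert, "join": sql_expert}
answer = route("SELECT revenue by region", experts, general_chat)
```

The value of the composite approach is that each component can be swapped or fine-tuned independently, open source or proprietary, without retraining the whole system.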

Data is a key ingredient for AI to really add economic value, so maybe the mantra now is: “Data is eating the world through AI,” Rao says.

However, many companies struggle to bridge the gap between data preparation and effective AI implementation. One of the key challenges, according to Rao, is understanding the economics of AI, which differs significantly from traditional software models such as Software-as-a-Service (SaaS), where many applications can run on a single piece of hardware.

AI models, on the other hand, require dedicated physical infrastructure for each additional user. According to Rao, this means that scaling AI will require costly hardware investments and lead to lower gross margins (typically below 20%) compared to SaaS. On the development side, rising costs make profitability difficult, especially as larger companies continue to raise capital and operate at a loss.
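
Rao’s margin point is simple arithmetic; a toy comparison (all figures invented for illustration) shows how per-user infrastructure costs compress gross margin relative to SaaS:

```python
# Illustrative arithmetic only: the revenue and cost figures are
# made-up assumptions, not Databricks or industry data.
def gross_margin(revenue_per_user: float, cost_per_user: float) -> float:
    return (revenue_per_user - cost_per_user) / revenue_per_user

# SaaS: many users share one server, so marginal cost per user is small.
saas_margin = gross_margin(revenue_per_user=100.0, cost_per_user=15.0)

# AI serving: each additional user needs dedicated GPU capacity,
# so cost scales nearly one-to-one with usage.
ai_margin = gross_margin(revenue_per_user=100.0, cost_per_user=85.0)
```

Under these assumed figures the SaaS margin is 85% while the AI margin is 15%, consistent with the sub-20% figure Rao cites.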

In this context, the key criterion for CXOs when integrating AI is to “define clear success criteria.” In AI, especially in specialized LLMs, this involves creating an assessment system, much like an exam, to measure the performance of the AI system, says Rao.

He adds that Databricks offers a variety of optimization tools, such as reranking and fine-tuning models, that help improve performance. Creating and refining these metrics is crucial to successful AI deployment.
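
The “exam-like” assessment Rao describes can be sketched as a small evaluation harness: a fixed set of questions with reference answers, scored against a model’s outputs. The names here (`grade_model`, the toy exact-match scorer, the canned lookup “model”) are illustrative assumptions, not a Databricks API; real evaluations use richer metrics than exact match.

```python
def exact_match(prediction: str, reference: str) -> bool:
    # Toy scorer: case- and whitespace-insensitive exact match.
    return prediction.strip().lower() == reference.strip().lower()

def grade_model(model, exam: list[dict]) -> float:
    # 'exam' is a list of {"question": ..., "answer": ...} items;
    # the returned score is the fraction answered correctly.
    correct = sum(
        exact_match(model(item["question"]), item["answer"])
        for item in exam
    )
    return correct / len(exam)

# A stand-in "model" for demonstration: a lookup table of canned answers.
canned = {"What is the baggage allowance?": "23 kg", "Refund window?": "7 days"}
model = lambda q: canned.get(q, "I don't know")

exam = [
    {"question": "What is the baggage allowance?", "answer": "23 kg"},
    {"question": "Refund window?", "answer": "14 days"},
]
score = grade_model(model, exam)  # one of two answers matches -> 0.5
```

Tracking such a score across model and prompt changes is what turns “is the AI good enough?” into a measurable success criterion.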

Rao points to Ola’s Krutrim, one of Databricks’ customers, which used the platform to build its own AI model instead of relying on pre-existing models. Other customers like Freshworks and Air India are using custom LLMs and composite AI systems to automate tasks like chatbots that help with questions about baggage policies or refunds.

Ola’s model is particularly notable given India’s linguistic diversity, where multiple languages such as Hindi, English and regional languages are used together in daily communication, Rao said.

New skill sets

However, Rao agrees that AI will have a profound impact on jobs, especially given the increasing use of co-pilots and fully autonomous AI agent systems in businesses. Autonomous AI agents, or so-called ‘agentic AI’ systems, are AI models that can achieve specific goals without human intervention.

“It’s clear that we need to figure out how humans will continue to be integrated into the process,” Rao says. Databricks is working on this by building tools that help customers build high-quality AI agents.

These agents go beyond simple tasks and create value by automating complex processes using customer data. But enterprise customers demand transparency, auditability, and security, making this a challenging space, according to Rao.

We need to better understand how to enable models to reason and learn in dynamic environments

“AI will change jobs dramatically, just as other technological advances have done in the past. As AI advances, key skills will include data engineering, system orchestration, product design, and AI tool use,” he adds.

Rao agrees that balancing innovation with responsible AI is also crucial, particularly around data privacy, bias, and transparency. He adds that Databricks excels at governance, with its Unity Catalog ensuring tight security across data and models.

While he notes that bias is application-specific and difficult to automate, he explains that Databricks offers filtering tools that allow companies to manage inputs and outputs, ensuring privacy, security, and content control. However, these tools are designed to be flexible and customizable to suit a variety of customer needs.
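
Input/output filtering of the kind described above can be sketched as a wrapper that scrubs both the prompt going into a model and the response coming out. The patterns and the `guarded` wrapper are illustrative assumptions, not Databricks’ actual filtering API; real guardrails would cover far more than two regexes.

```python
import re

# Toy patterns for sensitive data; a real deployment would use a
# maintained PII/content-policy ruleset, not two regexes.
PII_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),   # e.g. US SSN-shaped numbers
    re.compile(r"\b\d{16}\b"),              # e.g. 16-digit card numbers
]

def redact(text: str) -> str:
    for pattern in PII_PATTERNS:
        text = pattern.sub("[REDACTED]", text)
    return text

def guarded(model):
    # Wrap a model so both the incoming prompt and the outgoing
    # response pass through the same filter.
    def wrapper(prompt: str) -> str:
        return redact(model(redact(prompt)))
    return wrapper

echo = guarded(lambda p: f"You said: {p}")
out = echo("My card is 4111111111111111")
```

Because the filter sits outside the model, companies can customize the pattern list per application, which matches the flexibility Rao describes.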

Do we need a chief AI officer?

Given the complexity of the field, do businesses need a chief AI officer, especially when many large organizations already have C-suite roles such as chief information officer (CIO), chief technology officer (CTO), chief data officer, chief digital officer, and even chief marketing officer (CMO), who between them oversee various AI functions?

Rao agrees, noting that companies have reacted to the AI boom by rushing to integrate AI and hiring for the role without clear boundaries, leading to conflicts over budgets and responsibilities. A more practical solution, he says, would be to combine data and AI in a Chief Data and AI Officer role, separating those duties from the broader productivity-focused role of a CIO.

Lastly, but most importantly, Rao says that Artificial General Intelligence (AGI) is often misinterpreted. Some define it by how much human productivity it can replace, but true AGI would need to interact with the world, learn from its actions, and adapt—something far beyond current technology, he explains. While AGI remains a distant goal, he suggests that companies should focus on how AI can improve business processes and customer experiences today.