When IBM announced in May 2023 that its watsonx AI assistant was using the company’s generative AI tools to help enterprises modernize their mainframe applications by translating COBOL code into Java, it was just the beginning of IBM’s efforts to leverage generative AI for developers.
The company has taken another leap forward with Monday’s announcement of watsonx Code Assistant for general-purpose coding, based on its newly announced Granite 3.0 family of foundation models.
When IBM unveiled Granite in September 2023, the company touted it as AI for business, applying generative AI to both language and code.
Monday’s announcement at the IBM TechXchange conference in Las Vegas, Nevada, provides users with seven new models: the Granite-3.0-8B-Instruct and Granite-3.0-2B-Instruct language models; Granite-Guardian-3.0-8B and Granite-Guardian-3.0-2B, trained to detect jailbreaking, bias, violence, profanity, sexual content, and unethical behavior; Granite-3.0-8B-Instruct-Accelerator, which speeds inference and improves latency; and the Granite-3.0-3B-A800M-Instruct and Granite-3.0-1B-A400M-Instruct Mixture of Experts (MoE) models. All are released under the Apache 2.0 license, which, said Dario Gil, SVP and director of research at IBM, is “the most permissive.”
That move pleases analysts. “The deep commitment to AI open source by IBM is significant,” said Dion Hinchcliffe, VP and practice lead, CIO, at The Futurum Group. “Offering indemnification against certain legal liabilities and transparency differentiates IBM from competitors who may provide more opaque models or restrictive licenses.”
This month, a number of vendors’ products will support at least some of the Granite 3.0 models. IBM said that watsonx will include the Granite 3.0 Language and Guardian models; Nvidia NIM will get the Language, Guardian, and MoE models; Hugging Face will carry the Language, Guardian, Accelerator, and MoE models; and Ollama will offer the Language and MoE models.
IBM said that the Granite Language models will also be released for Red Hat Enterprise Linux (RHEL) AI and OpenShift AI in December.
Analyst Patrick Moorhead is impressed. “While I was skeptical at first, the more research I did leads me to say with confidence, for enterprises, these are competitive if not superior models,” said Moorhead, CEO and chief analyst at Moor Insights & Strategy. “Prior models were more efficient but weren’t as accurate, but now the models are both more accurate and more efficient.” To be completely convinced, Moorhead said, he will want to see enterprises successfully using the Granite models. “I think enterprises will watch IBM and their string of developments, and if they keep seeing innovations, they will lock in,” he said.
116 languages
The version of Granite in IBM’s new watsonx Code Assistant has been trained on 116 programming languages, said Keri Olson, VP product management, AI for Code, at IBM. While seven languages — Java, Python, C, C++, Go, JavaScript, and TypeScript — received the most attention based on customer demand, she emphasized that this is only the beginning. Other languages, such as Rust, C#, and RPG, are in line for that top-tier treatment.
“We’re certainly not going to stop there. We will continue to build on that with additional programming languages coming into our Tier One category. And then we will also continue to build out our integrations across developer tools,” Olson said.
At launch, IBM will support watsonx Code Assistant in the VS Code and Eclipse IDEs, with others under consideration. The company is also looking at adding agentic capabilities.
“We are not there today, but we are definitely thinking about those agentic flows and how we can get more autonomic work going in the future,” she said.
Additionally, she noted, since Java is one of the key languages in the new offering, IBM decided not to release the standalone watsonx Code Assistant for Enterprise Java that it announced in May.
Hinchcliffe likes what he’s hearing. “IBM is delivering advanced tools like code assistants supporting multiple languages and deeper integration with enterprise data. The increase in programming languages covered and enhancements in accuracy and performance are very welcome and help them push the envelope compared to other coding copilots,” he said. “Developers — and especially IT departments short on such increasingly rarified skills — will appreciate the enhanced support for legacy languages like COBOL and Java, which are still critical in enterprise environments. The ability to fine-tune AI models using internal data without external dependencies, keeping data within the company, will also resonate well with industries with strict compliance needs.”
For internal use
IBM has been using the new assistant internally since early this year, Olson said, and has found several compelling use cases. “I think that’s one of the most exciting things for developers; once they see that it can help them to take care of some of those tasks that maybe they’re not as interested in doing and allows them to focus on deeper thought and strategy and innovation, it really helps them.”
For example, she said, a small team of veteran and recently hired product developers inherited code repositories containing about 750 undocumented JavaScript files.
“The team was setting out to understand, ‘What is this code? What does it do?’ And they started with a PoC [proof of concept] where they used watsonx Code Assistant to document about 1000 lines of code across nine different files,” she said. “Manually, it took them about three minutes per file to document the code. I think that that was actually pretty fast, but when they used watsonx Code Assistant, they were able to understand and document the content of each file in about 12 seconds. So if you think about each file going from a three-minute exercise to a 12-second exercise, that is over a 90% time savings just for this specific task.”
They then checked watsonx’s work and found it to be about 90% correct, she said.
Code documentation is an often overlooked but nevertheless valuable use case for coding assistants, Olson said.
Another unexpected use is in test case generation. “It is definitely an area that developers do not tend to gravitate toward,” Olson said.
And, she added, there’s an integrated chat function in the new watsonx Code Assistant. “With the integrated chat function, you can simply use natural language to ask a question of the coding assistant, or to ask it to do something on your behalf, including, ‘please write this code for me.’ And there are definitely many cases internally that our internal clients have used it for code generation using natural language,” she said.
Augment developers, not replace them
However, she stressed, IBM watsonx Code Assistant is not meant to replace developers. “I really think it’s important to understand that the developer is still at the center of development and is still at the center of everything in the software development life cycle,” she said. “We are not at a point where we believe that AI can completely take over and do the job for the developer. And so this is an assistant, and should be used as such.”
Moorhead agreed.
“Those who study history know that these fears of net job loss are historically overblown,” he said. “What typically happens is that those job functions do more, higher-level work. When programming languages emerged, there were fears that all the machine coders would lose their jobs. They didn’t. They learned COBOL. I believe that this will repeat itself here.”
However, Hinchcliffe noted, despite this, some developers will still worry. “The concern over AI replacing developer jobs is quite real now, and while IBM makes a conscious effort to stress the ‘assistive’ nature of these tools, the path is sometimes fairly clear towards almost entirely replacing certain IT workers, especially in operations and basic dev maintenance tasks. Though IBM’s approach does focus on enabling developers — emphasizing ‘keeping them in the loop’ — rather than supplanting them, the broader narrative about AI in the job market will likely overshadow this.”
He’s also concerned about IBM’s timelines. “While IBM has made significant strides, there is still some uncertainty regarding their timeline for agents and their practical use,” he said. “IBM is honest about the fact that this space is still developing, which is refreshing, but may come across as underwhelming given the current hype around AI. Developers might be expecting more breakthrough capabilities sooner rather than later. Overall, however, IBM’s approach is well thought out and focused on what enterprises need today, but the messaging around augmentation will need to be clearer to overcome fears of job displacement.”