New AI tools, new models, and new collaborations

Welcome back to the Circuit Breaker, where you can find the best recaps of the latest innovations in AI, quantum computing, semiconductors, and more, from across IBM Research and beyond.

This week's edition was created by Kim Martineau, Peter Hess, and Mike Murphy.


This week on Circuit Breaker: Week of March 30 - April 3

  • IBM and ETH Zurich’s latest collaboration 
  • Transparent supply chains for AI 
  • New Granite time-series models 


After 70 years, IBM and ETH Zurich open a new chapter 

The IBM Research facility in Zurich, Switzerland.

IBM Research and ETH Zurich are starting a new collaboration with the goal of creating the algorithmic foundation for the next era in computing. As part of the 10-year agreement, IBM will support new ETH Zurich professorships and joint research projects to train the next generation of innovative algorithm developers. 

We recently sat down with Alessandro Curioni, director of the IBM Research Zurich lab, to discuss IBM’s longstanding ties to ETH Zurich, dating back to the lab’s founding 70 years ago.  

  • We covered how the convergence of AI, classical computing, and quantum systems could enable scientific discoveries beyond the reach of today’s computers.  

“The future won’t be AI plus quantum, but AI times quantum because quantum computing will be broader than it is today,” Curioni says. “Quantum will be an enabling technology.” 

We also spoke with Curioni about his career at IBM Research Zurich, where he started as an intern and distinguished himself with two major prizes for his work in high-performance scientific computing.

🎙️ Read the full interview  


Software often comes with a bill of materials. Should AI, too? 


Behind nearly every product is something called a BOM — a bill of materials that breaks down what’s in it and how it was made. The concept moved from physical products to software in the 2010s, initially to help developers sort through open-source licensing requirements.  

  • As malicious attacks grew more common, software BOMs (or SBOMs) expanded their scope.  
  • As AI takes over some traditional software roles, there’s a growing push to bring greater consistency and accountability to the way that data-driven AI models are documented, through an AIBOM. 

“If you look at any model card on Hugging Face, it's a README file which you can look at, but you can’t actually use programmatically without developing some parsing mechanism,” said Rakesh Jain, manager and chief architect of the data management platform behind IBM Granite. “Every model provider writes it differently.” 

As a first step toward a full AIBOM, IBM just released model and dataset metadata for its Granite 4.0 models in machine-readable JSON.  

  • The structured format lets model builders generate model cards instantly, and lets enterprises compare models and verify that those put into production meet internal safety and security policies, adding extra risk controls at runtime.  
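The idea can be sketched in a few lines of Python: once metadata is structured rather than free-form README prose, a model card can be rendered programmatically and policy checks become simple field lookups. The field names below are illustrative assumptions, not IBM's actual Granite metadata schema.

```python
import json

# Hypothetical machine-readable model metadata; field names are
# illustrative, not the actual Granite 4.0 JSON schema.
metadata_json = """
{
  "model_name": "example-model-3b",
  "license": "apache-2.0",
  "training_data": ["dataset-a", "dataset-b"],
  "intended_use": "text generation"
}
"""

def render_model_card(raw: str) -> str:
    """Turn structured metadata into a human-readable model card."""
    meta = json.loads(raw)
    lines = [
        f"# {meta['model_name']}",
        f"**License:** {meta['license']}",
        "**Training data:** " + ", ".join(meta["training_data"]),
        f"**Intended use:** {meta['intended_use']}",
    ]
    return "\n".join(lines)

def passes_policy(raw: str, allowed_licenses: set[str]) -> bool:
    """A toy policy check: structured fields make this a one-liner."""
    return json.loads(raw)["license"] in allowed_licenses

print(render_model_card(metadata_json))
print(passes_policy(metadata_json, {"apache-2.0", "mit"}))
```

With a README-style model card, each of these steps would require a bespoke parser per provider, which is exactly the problem Jain describes above.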

Granite’s transition to structured disclosures brings a new level of transparency to the models, which hold the number one spot on Stanford’s Foundation Model Transparency Index and were among the first large language models to earn ISO 42001 certification.  

📝 Learn more about the industry’s move to AIBOMs 


Updating the Granite time-series suite of models 


Time-series AI models are becoming core to enterprises, but when working with time-series data, a one-size-fits-all solution is rarely the answer. A model that works well for financial analysis may not be as well suited to monitoring industrial processes.  

  • Point forecasting, probabilistic forecasting, anomaly detection, and ultra‑fast inference all place very different demands on AI systems. 
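The first two of those demands differ in what the model is asked to output. A toy sketch, using only sampled future trajectories rather than any actual Granite model API: a point forecast collapses the samples to one number, while a probabilistic forecast keeps quantiles of the distribution.

```python
import statistics

# Hypothetical sampled values for one future timestep of a series;
# real models would produce these, here they are made up for illustration.
samples = [9.2, 10.1, 10.4, 9.8, 11.0, 10.6, 9.5, 10.9]

# Point forecasting: a single best-guess value (here, the mean).
point_forecast = statistics.mean(samples)

# Probabilistic forecasting: keep the spread, not just the center.
ordered = sorted(samples)

def quantile(q: float) -> float:
    """Nearest-rank quantile of the sampled trajectories."""
    idx = min(int(q * len(ordered)), len(ordered) - 1)
    return ordered[idx]

probabilistic_forecast = {q: quantile(q) for q in (0.1, 0.5, 0.9)}
print(point_forecast)
print(probabilistic_forecast)
```

A model tuned to minimize point error can be a poor quantile estimator and vice versa, which is one reason the suite below ships separate models rather than a single generalist.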

That’s why IBM Research has released a newly updated family of time-series foundation models that cover the enterprise spectrum, each one optimized for a distinct class of prediction problems:  

  • FlowState-r1.1 is built for point forecasting. It handles both short and long input and forecast horizons, as well as varied sampling rates. 
  • PatchTST-FM-r1 is a transformer-based model that can handle context lengths from 128 to 8,192 timepoints, and it excels at probabilistic forecasting. 
  • TTM-r3 is designed to balance speed and accuracy for efficient forecasting, anomaly detection, classification, and search. 

The models are all open weight, and the research versions are available under a non-commercial license. Released this week, they are already topping the GIFT-Eval leaderboard on Hugging Face.  

🕰️ Read more about their features and how to start using them 


The IBM Granite 4.0 3B Vision model is now live 


ChartNet: A new dataset for chart understanding from IBM Research 


Fast Company: IBM named one of the most innovative enterprise companies in the world 


IBM Think: "One giant leap for AI"


Check out the new Substack from David Cox, IBM's VP for AI models 


Understanding control in LLM systems 


Highlighting new publications from IBM researchers that we liked the sound of:


If you liked this, please consider following IBM Research on LinkedIn. If you have any questions, whether they're atomically small or the size of a quantum computer, don't hesitate to reach out.

And if you want to go even deeper, subscribe to our monthly newsletter, Future Forward, for more on the latest breakthroughs in AI, quantum computing, and hybrid cloud.
