Intel Acquires Model Optimizer SigOpt

Intel Corp. is acquiring AI optimization software vendor SigOpt, a move the chip maker said would complement its existing AI software portfolio while integrating SigOpt’s tools with its AI hardware to accelerate and scale AI software used by model developers.

The acquisition also addresses the growing complexity of machine learning and neural network models and the resulting inability of hardware to keep pace.

Terms of the transaction announced Thursday (Oct. 29) were not disclosed. Intel (NASDAQ: INTC) said it expects the acquisition to close by the end of this quarter.

San Francisco-based SigOpt’s co-founders and brain trust, CEO Scott Clark and CTO Patrick Hayes, will join Intel’s machine learning team.

SigOpt was founded in 2014 to create a commercial product from Clark’s academic research at Cornell University on Bayesian optimization techniques. Combined with Intel’s AI computing and machine learning capabilities, Clark said SigOpt’s optimization software would help “unlock entirely new AI capabilities for modelers.”

SigOpt’s AI software is designed to boost productivity and performance across hardware and software parameters, resulting in more accurate and better performing machine learning models—even as complexity grows.
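SigOpt's core technique, Bayesian optimization, treats model tuning as a black-box problem: fit a probabilistic surrogate to the objective, then choose the next hyperparameter setting by trading off exploitation against exploration. The following is a minimal one-dimensional sketch using a Gaussian-process surrogate with an upper-confidence-bound acquisition; it is illustrative only and not SigOpt's actual implementation, and every name in it is hypothetical:

```python
import math
import random

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix copy
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def kernel(a, b, length=1.0):
    """Squared-exponential covariance between two points."""
    return math.exp(-((a - b) ** 2) / (2 * length ** 2))

def gp_posterior(X, y, x_star):
    """Posterior mean and variance of a zero-mean GP at x_star, given observations (X, y)."""
    n = len(X)
    # Small diagonal "nugget" keeps the covariance matrix well conditioned.
    K = [[kernel(X[i], X[j]) + (1e-6 if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    alpha = solve(K, y)
    k_star = [kernel(x, x_star) for x in X]
    mean = sum(k_star[i] * alpha[i] for i in range(n))
    v = solve(K, k_star)
    var = kernel(x_star, x_star) - sum(k_star[i] * v[i] for i in range(n))
    return mean, max(var, 1e-12)

def bayes_opt(f, lo, hi, n_init=3, n_iter=12, beta=2.0):
    """Maximize black-box f on [lo, hi] via GP surrogate + upper-confidence-bound acquisition."""
    random.seed(0)
    X = [lo + (hi - lo) * random.random() for _ in range(n_init)]
    y = [f(x) for x in X]
    grid = [lo + (hi - lo) * i / 200 for i in range(201)]  # candidate evaluation points
    for _ in range(n_iter):
        # Pick the candidate with the highest mean + beta * std (optimism under uncertainty).
        def ucb(g):
            m, s2 = gp_posterior(X, y, g)
            return m + beta * math.sqrt(s2)
        nxt = max(grid, key=ucb)
        X.append(nxt)
        y.append(f(nxt))
    i = max(range(len(y)), key=lambda i: y[i])
    return X[i], y[i]
```

On a toy objective such as `f(x) = -(x - 2)**2`, the loop concentrates its later evaluations near the optimum at `x = 2`, which is the sample-efficiency property that makes this approach attractive when each evaluation is an expensive model-training run.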

“SigOpt’s AI software platform and data science talent will augment Intel software, architecture, product offerings and teams,” said Raja Koduri, Intel’s senior vice president and general manager of architecture, graphics and software.

The startup previously attracted the attention of In-Q-Tel, the investment arm of U.S. intelligence agencies, which eventually acquired a stake in the AI software developer.

Among the company’s strengths is its focus on metrics used to improve the performance of machine learning models.

The SigOpt deal also addresses concerns raised last year by Naveen Rao, vice president and general manager of Intel’s AI Products Group. Neural networks have grown so big, Rao noted, with so many parameters to calculate, that AI hardware is unable to keep up.

“The trend to be aware of is the number of parameters, call this the complexity of the model,” Rao said. “The number of parameters in a neural network model is actually increasing on the order of 10x year-on-year. This is an exponential that I’ve never seen before,” Rao noted during Intel’s most recent AI summit.

“AI is driving the compute needs of the future,” Intel’s Koduri added in announcing the SigOpt deal. “It is even more important for software to automatically extract the best compute performance while scaling AI models.”

Source: www.datanami.com
