

New o1 model of LLM at OpenAI could change hardware market

Nitin Gupta - AI News - November 28, 2024


OpenAI and other leading AI companies are developing new training techniques to overcome the limitations of current methods. Addressing unexpected delays and complications in the development of larger, more powerful language models, these fresh techniques focus on human-like behaviour to teach algorithms to ‘think’.

Reportedly led by a dozen AI researchers, scientists, and investors, the new training techniques, which underpin OpenAI’s recent ‘o1’ model (formerly Q* and Strawberry), have the potential to transform the landscape of AI development. The reported advances may also influence the types and quantities of resources AI companies will need on an ongoing basis, including the specialised hardware and energy required to develop AI models.

The o1 model is designed to approach problems in a way that mimics human reasoning and thinking, breaking down numerous tasks into steps. The model also utilises specialised data and feedback provided by experts in the AI industry to enhance its performance.
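The step-by-step decomposition described above can be sketched in miniature. The snippet below is purely illustrative: the function, the toy word problem, and the recorded "steps" are assumptions standing in for the model's reasoning process, not o1 internals.

```python
def solve_stepwise(people: int, apples_each: int, eaten: int):
    """Toy stand-in for step-by-step reasoning: solve a word problem
    by recording each intermediate step, rather than jumping straight
    to a final answer (illustrative only, not how o1 actually works)."""
    steps = []
    total = people * apples_each    # step 1: total apples bought
    steps.append(f"{people} x {apples_each} = {total}")
    remaining = total - eaten       # step 2: subtract apples eaten
    steps.append(f"{total} - {eaten} = {remaining}")
    return remaining, steps

answer, trace = solve_stepwise(3, 4, 5)
print(answer)  # 7
print(trace)   # ['3 x 4 = 12', '12 - 5 = 7']
```

The point of the sketch is that each intermediate result is made explicit and inspectable, which is the behaviour the o1 approach is reported to encourage.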

Since ChatGPT was unveiled by OpenAI in 2022, there has been a surge in AI innovation, and many technology companies claim existing AI models require expansion, be it through greater quantities of data or improved computing resources. Only then can AI models consistently improve.

Now, AI experts have reported limitations in scaling up AI models. The 2010s were a revolutionary period for scaling, but Ilya Sutskever, co-founder of the AI labs Safe Superintelligence (SSI) and OpenAI, says that the training of AI models, particularly in understanding language structures and patterns, has levelled off.

“The 2010s were the age of scaling, now we’re back in the age of wonder and discovery once again. Scaling the right thing matters more now,” he said.

In recent times, AI lab researchers have experienced delays and challenges in developing and releasing large language models (LLMs) that are more powerful than OpenAI’s GPT-4 model.

First, there is the cost of training large models, often running into tens of millions of dollars. And, due to complications that arise, like hardware failing due to system complexity, a final analysis of how these models run can take months.

In addition to these challenges, training runs require substantial amounts of energy, often resulting in power shortages that can disrupt processes and impact the wider electricity grid. Another issue is the colossal amount of data large language models use, so much so that AI models have reportedly used up all accessible data worldwide.

Researchers are exploring a technique known as ‘test-time compute’ to improve current AI models when being trained or during inference phases. The method can involve the generation of multiple answers in real-time to decide on a range of best solutions. Therefore, the model can allocate greater processing resources to difficult tasks that require human-like decision-making and reasoning. The aim – to make the model more accurate and capable.
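One common form of test-time compute is best-of-N selection: draw several candidate answers at inference time and keep the one a verifier scores highest. The sketch below is a minimal illustration on a toy arithmetic task; the sampler and scorer are hypothetical stand-ins for a real model and a learned reward model.

```python
TRUE_ANSWER = 17 * 24  # toy task: "what is 17 * 24?"

def sample_candidates(n: int):
    """Stand-in for sampling n answers from a language model:
    deterministic noisy guesses around the true answer."""
    offsets = [-2, -1, 0, 1, 2]
    return [TRUE_ANSWER + offsets[i % len(offsets)] for i in range(n)]

def score(candidate: int) -> int:
    """Stand-in for a verifier / reward model. In practice this would
    be a learned scorer; here it simply measures closeness to the
    known answer."""
    return -abs(candidate - TRUE_ANSWER)

def best_of_n(n: int = 5) -> int:
    """Spend extra inference-time compute: generate several candidates,
    then select the best-scoring one."""
    return max(sample_candidates(n), key=score)

print(best_of_n())  # 408
```

Raising `n` buys accuracy with more inference compute rather than a bigger model, which is the trade-off the test-time compute approach exploits.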

Noam Brown, a researcher at OpenAI who helped develop the o1 model, shared an example of how a new approach can achieve surprising results. At the TED AI conference in San Francisco last month, Brown explained that “having a bot think for just 20 seconds in a hand of poker got the same boosting performance as scaling up the model by 100,000x and training it for 100,000 times longer.”

Rather than simply increasing the model size and training time, this can change how AI models process information and lead to more powerful, efficient systems.

It is reported that other AI labs have been developing versions of the o1 technique. These include xAI, Google DeepMind, and Anthropic. Competition in the AI world is nothing new, but we could see a significant impact on the AI hardware market as a result of these new techniques. Companies like Nvidia, which currently dominates the supply of AI chips thanks to high demand for its products, may be particularly affected by updated AI training techniques.

Nvidia became the world’s most valuable company in October, and its rise in fortunes can be largely attributed to its chips’ use in AI arrays. New techniques may impact Nvidia’s market position, forcing the company to adapt its products to meet the evolving AI hardware demand. Potentially, this could open more avenues for new competitors in the inference market.

A new age of AI development may be on the horizon, driven by evolving hardware demands and more efficient training methods such as those deployed in the o1 model. The future of both AI models and the companies behind them could be reshaped, unlocking unprecedented possibilities and greater competition.

See also: Anthropic urges AI regulation to avoid catastrophes


Tags: artificial intelligence, machine learning, models



