Data Storage

Nvidia Wants To Speed Up Data Transfer By Connecting Data Center GPUs To SSDs (arstechnica.com) 15

Microsoft brought DirectStorage to Windows PCs this week. The API promises faster load times and more detailed graphics by letting game developers make apps that load graphical data from the SSD directly to the GPU. Now, Nvidia and IBM have created a similar SSD/GPU technology, but they are aiming it at the massive data sets in data centers. From a report: Instead of targeting console or PC gaming like DirectStorage, Big accelerator Memory (BaM) is meant to provide data centers quick access to vast amounts of data in GPU-intensive applications, like machine-learning training, analytics, and high-performance computing, according to a research paper spotted by The Register this week. Entitled "BaM: A Case for Enabling Fine-grain High Throughput GPU-Orchestrated Access to Storage" (PDF), the paper by researchers at Nvidia, IBM, and a few US universities proposes a more efficient way to run next-generation applications in data centers with massive computing power and memory bandwidth. BaM also differs from DirectStorage in that the creators of the system architecture plan to make it open source.
Programming

The Dangers of CS 'Philanthrocapitalism' (freedom-to-tinker.com) 41

Princeton University has a research center studying "digital technologies in public life," which runs a web site with commentary and analysis "from the digital frontier, written by the Center's faculty, students, and friends."

Long-time Slashdot reader theodp summarizes the site's recent warning on the dangers of "philanthrocapitalism," in a piece noting ominously that "The tech industry controls CS conference funding." "Research about the influence of computing technologies, such as artificial intelligence (AI), on society relies heavily upon the financial support of the very companies that produce those technologies," writes Princeton Research Fellow Klaudia Jazwinska of the dangers of 'philanthrocapitalism'. "Corporations like Google, Microsoft, and IBM spend millions of dollars each year to sponsor labs, professorships, PhD programs, and conferences in fields like computer science (CS) and AI ethics at some of the world's top institutions. Industry is the main consumer of academic CS research, and 84 percent of CS professors receive at least some industry funding."

"Relying on large companies and the resources they control can create significant limitations for the kinds of CS research that are proposed, funded and published. The tech industry plays a large hand in deciding what is and isn't worthy of examination, or how issues are framed. [...] The scope of what is reasonable to study is therefore shaped by what is of value to tech companies. There is little incentive for these corporations to fund academic research about issues that they consider more marginal or which don't relate to their priorities."

Jazwinska concludes, "Given the extent of financial entanglement between Big Tech and academia, it might be unrealistic to expect CS scholars to completely resist accepting any industry funding—instead, it may be more practicable to make a concerted effort to establish higher standards for and greater transparency regarding sponsorship."

Red Hat Software

Red Hat Is Discontinuing Sales and Services In Russia and Belarus (newsobserver.com) 49

Red Hat, the Raleigh-based open-source software company, said Tuesday it is halting all sales and services to companies in Russia and Belarus -- a response to the Russian invasion of Ukraine that has put Red Hat employees in harm's way. Raleigh News & Observer reports: Paul Cormier, Red Hat's chief executive officer, announced the decision in an email to employees, saying: "As a company, we stand in unity with everyone affected by the violence and condemn the Russian military's invasion of Ukraine." Red Hat's announcement comes a day after its parent company, IBM, which also has a large presence in the Triangle, suspended all business operations in Russia.

"While relevant sanctions must guide many of our actions, we've taken additional measures as a company," Cormier wrote. "Effective immediately, Red Hat is discontinuing sales and services in Russia and Belarus (for both organizations located in or headquartered in Russia or Belarus)." Red Hat said it has approximately two dozen employees in Ukraine, which has become an important tech hub in Eastern Europe in recent years. It is home to tens of thousands of contractors and employees for U.S. firms. In his email, Cormier said that Red Hat has helped dozens of employees and family members in Ukraine relocate to safer locations. Many of them have gone to neighboring Poland, he noted. [...] However, Ukraine has barred men ages 18 to 60 from leaving the country, meaning many of Red Hat's employees can't be relocated from the country. We "continue to help those who remain in the country in any way possible," Cormier wrote.

Science

Physicists Produce Biggest Time Crystal Yet (science.org) 38

sciencehabit shares a report from Science.org: Physicists in Australia have programmed a quantum computer half a world away to make, or at least simulate, a record-size time crystal -- a system of quantum particles that locks into a perpetual cycle in time, somewhat akin to the repeating spatial pattern of atoms in an actual crystal. The new time crystal comprises 57 quantum particles, more than twice the size of a 20-particle time crystal simulated last year by scientists at Google. That's so big that no conventional computer could simulate it, says Chetan Nayak, a condensed matter physicist at Microsoft, who was not involved in the work. "So that's definitely an important advance." The work shows the power of quantum computers to simulate complex systems that may otherwise exist only in physicists' theories.

[Philipp Frey and Stephan Rachel, theorists at the University of Melbourne] performed the simulation remotely, using quantum computers built and run by IBM in the United States. The qubits, which can be set to 0, 1, or 1 and 0 at once, can be programmed to interact like magnets. For certain settings of their interactions, the researchers found, any initial setting of the 57 qubits, such as 01101101110 ..., remains stable, returning to its original state every two pulses, the researchers report today in Science Advances. [...] Whereas more than 100 researchers worked on the Google simulation, Frey and Rachel worked alone to perform their larger demonstration, submitting it to the IBM computers over the internet. "It was just me, my graduate student, and a laptop," Rachel says, adding that "Philipp is brilliant!" The entire project took about 6 months, he estimates. The demonstration isn't perfect, Rachel says. The flipping pattern ought to last indefinitely, he says, but the qubits in IBM's machines can only hold their states long enough to simulate about 50 cycles. Ultimately, the stabilizing effect of the interactions might be used to store the state of a string of qubits in a kind of memory for a quantum computer, he notes, but realizing such an advance will take -- what else? -- time.
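
For readers curious what such a periodically driven ("Floquet") circuit looks like in code, here is a minimal, purely illustrative Qiskit sketch -- not the Melbourne group's actual program, and far smaller than their 57-qubit run. It assumes the qiskit and qiskit-aer packages are installed and simply strings together the ingredients described above: a near-perfect global spin flip each cycle, disordered qubit-qubit couplings, and random local fields.

```python
# Illustrative sketch only -- not the Melbourne group's actual code.
# Assumes qiskit and qiskit-aer are installed (pip install qiskit qiskit-aer).
import numpy as np
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

n_qubits = 8          # the real experiment used 57; 8 keeps the simulation cheap
n_cycles = 10         # number of drive periods (an even number) to apply
rng = np.random.default_rng(42)

qc = QuantumCircuit(n_qubits, n_qubits)

# Flip a few qubits to set an arbitrary initial bit pattern (any choice works).
for q in [1, 2, 4, 5, 7]:
    qc.x(q)

# Fixed random couplings and fields -- the "disorder" that stabilizes the phase.
zz_angles = rng.uniform(0.2, 0.6, n_qubits - 1)
z_angles = rng.uniform(-np.pi, np.pi, n_qubits)

for _ in range(n_cycles):
    # 1) Imperfect global spin flip: a rotation close to, but not exactly, pi.
    for q in range(n_qubits):
        qc.rx(0.97 * np.pi, q)
    # 2) Disordered Ising-like interactions between neighbouring qubits.
    for q in range(n_qubits - 1):
        qc.rzz(zz_angles[q], q, q + 1)
    # 3) Disordered local fields.
    for q in range(n_qubits):
        qc.rz(z_angles[q], q)

qc.measure(range(n_qubits), range(n_qubits))

sim = AerSimulator()
counts = sim.run(transpile(qc, sim), shots=1024).result().get_counts()
print(counts)  # after an even number of cycles the counts should cluster near the initial pattern
```

On a noiseless simulator, measuring after an even number of cycles should return the initial bit pattern with high probability; on real hardware, decoherence limits how many cycles survive, which is the roughly 50-cycle ceiling Rachel describes above.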

AI

100 Billion Face Photos? Clearview AI Tells Investors It's On Track to Identify 'Almost Everyone in the World' (msn.com) 77

The Washington Post reports: Clearview AI is telling investors it is on track to have 100 billion facial photos in its database within a year, enough to ensure "almost everyone in the world will be identifiable," according to a financial presentation from December obtained by The Washington Post.

Those images — equivalent to 14 photos for each of the 7 billion people on Earth — would help power a surveillance system that has been used for arrests and criminal investigations by thousands of law enforcement and government agencies around the world. And the company wants to expand beyond scanning faces for the police, saying in the presentation that it could monitor "gig economy" workers and is researching a number of new technologies that could identify someone based on how they walk, detect their location from a photo or scan their fingerprints from afar.

The 55-page "pitch deck," the contents of which have not been reported previously, reveals surprising details about how the company, whose work already is controversial, is positioning itself for a major expansion, funded in large part by government contracts and the taxpayers the system would be used to monitor. The document was made for fundraising purposes, and it is unclear how realistic its goals might be. The company said that its "index of faces" has grown from 3 billion images to more than 10 billion since early 2020 and that its data collection system now ingests 1.5 billion images a month.

With $50 million from investors, the company said, it could bulk up its data collection powers to 100 billion photos, build new products, expand its international sales team and pay more toward lobbying government policymakers to "develop favorable regulation."

The article notes that major tech companies like Amazon, Google, IBM and Microsoft have all limited or ended their own sales of facial recognition technology — adding that Clearview's presentation simply describes this as a major business opportunity for itself.

In addition, the Post reports Clearview's presentation brags "that its product is even more comprehensive than systems in use in China, because its 'facial database' is connected to 'public source metadata' and 'social linkage' information."
Chromium

Otter Browser Aims To Bring Chromium To Decades-Old OS/2 Operating System (xda-developers.com) 54

"The OS/2 community is getting close to obtaining a modern browser on their platform," writes Slashdot reader martiniturbide. In an announcement article on Monday, president of the OS/2 Voice community, Roderick Klein, revealed that a public beta of the new Chromium-based Otter Browser will arrive "in the last week of February or the first week of March." XDA Developers reports: OS/2 was the operating system developed jointly by IBM and Microsoft in the late 1980s and early 1990s, with the intended goal of replacing all DOS and Windows-based systems. However, Microsoft decided to focus on Windows after the immense popularity of Windows 3.0 and 3.1, leaving IBM to continue development on its own. IBM eventually stopped working on OS/2 in 2001, but two other companies licensed the operating system to continue where IBM left off -- first eComStation, and more recently, ArcaOS.

BitWise Works GmbH and the Dutch OS/2 Voice foundation started work on Otter Browser in 2017, as it was becoming increasingly difficult to keep an updated version of Firefox available on OS/2 and ArcaOS. Firefox 49 ESR from 2016 is the latest version available, because that's around the time Mozilla started rewriting significant parts of Firefox with Rust code, and there's no Rust compiler for OS/2. Since then, the main focus has been porting Qt 5.0 to OS/2, which includes the QtWebEngine (based on Chromium). This effort also has the side effect of making more cross-platform ports possible in the future.

IBM

IBM Cloud To Offer Z-Series Mainframes For First Time (theregister.com) 38

The 111-year-old tech institution today announced it will offer the Z mainframe platform on the IBM Cloud, by offering virtual machines running z/OS as-a-service. The Register reports: These VMs are intended for mainframe test and development environments, rather than have Big Blue care for and feed virtual production mainframes in the cloud for you. The service will be tied to Wazi -- an IBM development environment for mainframe applications. Test and dev was one of the first workloads suggested as an ideal candidate to run in the cloud. Before elastic infrastructure-as-a-service, organizations often found themselves building and operating replicas of their production stacks for their developers. Renting such environments as and when needed in the cloud was often -- and often remains -- cheaper than owning and operating the necessary infrastructure.

This infrastructure-as-a-service offering is therefore pitched as a way to reduce the time and resources required to develop mainframe applications. IBM said the new offering is currently a "closed experimental" technology -- we think that means closed beta. It's certainly not mentioned in the catalog of the IBM Cloud account your correspondent maintains, so information on cost or specs is not available at the time of writing. The service will become generally available in the second half of 2022 -- after IBM's 112th birthday.

IBM

Making 'Dinobabies' Extinct: IBM's Push for a Younger Work Force (nytimes.com) 73

In recent years, former IBM employees have accused the company of age discrimination in a variety of legal filings and press accounts, arguing that IBM sought to replace thousands of older workers with younger ones to keep pace with corporate rivals. From a report: Now it appears that top IBM executives were directly involved in discussions about the need to reduce the portion of older employees at the company, sometimes disparaging them with terms of art like "dinobabies." A trove of previously sealed documents made public by a Federal District Court on Friday show executives discussing plans to phase out older employees and bemoaning the company's relatively low percentage of millennials. The documents, which emerged from a lawsuit contending that IBM engaged in a yearslong effort to shift the age composition of its work force, appear to provide the first public piece of direct evidence about the role of the company's leadership in the effort.

"These filings reveal that top IBM executives were explicitly plotting with one another to oust older workers from IBM's work force in order to make room for millennial employees," said Shannon Liss-Riordan, a lawyer for the plaintiff in the case. Ms. Liss-Riordan represents hundreds of former IBM employees in similar claims. She is seeking class-action status for some of the claims, though courts have yet to certify the class. Adam Pratt, an IBM spokesman, defended the company's employment practices. "IBM never engaged in systemic age discrimination," he said. "Employees were separated because of shifts in business conditions and demand for certain skills, not because of their age." Mr. Pratt said that IBM hired more than 10,000 people over 50 in the United States from 2010 to 2020, and that the median age of IBM's U.S. work force was the same in each of those years: 48. The company would not disclose how many U.S. workers it had during that period.

IBM

Canada Will Get Its First Universal Quantum Computer From IBM (engadget.com) 32

An anonymous reader shares a report: Quantum computing is still rare enough that merely installing a system in a country is a breakthrough, and IBM is taking advantage of that novelty. The company has forged a partnership with the Canadian province of Quebec to install what it says is Canada's first universal quantum computer. The five-year deal will see IBM install a Quantum System One as part of a Quebec-IBM Discovery Accelerator project tackling scientific and commercial challenges. The team-up will see IBM and the Quebec government foster microelectronics work, including progress in chip packaging thanks to an existing IBM facility in the province. The two also plan to show how quantum and classical computers can work together to address scientific challenges, and expect quantum-powered AI to help discover new medicines and materials. IBM didn't say exactly when it would install the quantum computer. However, it will be just the fifth Quantum One installation planned by 2023 following similar partnerships in Germany, Japan, South Korea and the US. Canada is joining a relatively exclusive club, then.
Bitcoin

Quantum Computers Are a Million Times Too Small To Hack Bitcoin (newscientist.com) 61

MattSparkes shares a report from New Scientist: Quantum computers would need to become around one million times larger than they are today in order to break the SHA-256 algorithm that secures bitcoin, which would put the cryptocurrency at risk from hackers. Breaking this impenetrable code is essentially impossible for ordinary computers, but quantum computers, which can exploit the properties of quantum physics to speed up some calculations, could theoretically crack it open.

[Mark Webber at the University of Sussex, UK, and his colleagues] calculated that breaking bitcoin's encryption in this 10-minute window would require a quantum computer with 1.9 billion qubits, while cracking it in an hour would require a machine with 317 million qubits. Even allowing for a whole day, this figure only drops to 13 million qubits. This is reassuring news for bitcoin owners because current machines have only a tiny fraction of this -- IBM's record-breaking superconducting quantum computer has only 127 qubits, so devices would need to become a million times larger to threaten the cryptocurrency, something Webber says is unlikely to happen for a decade.
The study has been published in the journal AVS Quantum Science.
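
To make the scale of those numbers concrete, here is a quick back-of-the-envelope check (not from the paper) using only the figures quoted above:

```python
# Back-of-the-envelope check using only the figures quoted in the article.
current_qubits = 127            # IBM's record-breaking Eagle processor
needed = {
    "10 minutes": 1.9e9,        # qubits needed to crack it within one block window
    "1 hour": 317e6,
    "24 hours": 13e6,
}

for window, qubits in needed.items():
    factor = qubits / current_qubits
    print(f"{window:>10}: {qubits:,.0f} qubits needed, "
          f"about {factor:,.0f}x today's largest machine")

# Even the most generous 24-hour window implies a machine roughly 100,000x
# larger than Eagle; the shorter windows push the factor into the millions.
```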
IBM

IBM Sells Some Watson Health Assets for More Than $1 Billion (bloomberg.com) 17

IBM agreed to sell part of its IBM Watson Health business to private equity firm Francisco Partners, scaling back the technology company's once-lofty ambitions in health care. From a report: The value of the assets being sold, which include extensive and wide-ranging data sets and products, and image software offerings, is more than $1 billion, according to people familiar with the plans. The deal "is a clear next step as IBM becomes even more focused on our platform-based hybrid cloud and AI strategy," said Tom Rosamilia, senior vice president, IBM Software. "IBM remains committed to Watson, our broader AI business, and to the clients and partners we support in healthcare IT." IBM launched Watson Health in 2015 with the aim of using its core artificial intelligence platform to help health care providers analyze troves of data and ultimately revolutionize cancer treatment. Many of the company's ambitions haven't panned out, though, and some customers have complained that its products didn't match the hype. Even after spending roughly $4 billion in acquisitions to prop up the initiative, Watson hasn't delivered the kind of progress IBM initially envisioned and the unit wasn't profitable. Last year, the Wall Street Journal reported the unit generated about $1 billion of annual revenue.
Input Devices

The Origin of the Blinking Cursor (inverse.com) 99

Long-time Slashdot reader jimminy_cricket shares a new article from the technology site Inverse exploring the origin of blinking cursors.

They trace the invention to the 1960s and electronics engineer Charles Kiesling, a naval veteran of the Korean War who "spent his immediate post-war years on a new challenge: the exploding computing age." Still decades away from personal computers — let alone portable ones — Kiesling was joining the ranks of engineers tinkering with room-sized computers like the IBM 650 or the aging ENIAC. He joined Sperry Rand, now Unisys, in 1955, and helped develop the kind of computer guts that casual users rarely think about. This includes innards like logic circuitry, which enables your computer to make complex conditional decisions like "or," "and," or "if only" instead of simply "yes" or "no". One of these seemingly innocuous advancements was a 1967 patent filing Kiesling made for a blinking cursor....

According to a post on a computer science message board from a user purporting to be Kiesling's son, the inspiration for this invention was simply utility. "I remember him telling me the reason behind the blinking cursor, and it was simple," Kiesling's son writes. "He said there was nothing on the screen to let you know where the cursor was in the first place. So he wrote up the code for it so he would know where he was ready to type on the Cathode Ray Tube."

The blinking, it turns out, is simply a way to catch the coders' attention and stand apart from a sea of text.
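
The concept is simple enough to demonstrate in a few lines. The toy sketch below (purely illustrative, and nothing like Kiesling's 1967 CRT circuitry) toggles a block character at the end of a prompt on a timer, which is all a blinking cursor really is:

```python
# Toy illustration of a blinking cursor: alternate between drawing a block
# character and a blank at the insertion point. Nothing here resembles
# Kiesling's 1967 hardware; it just shows the idea of toggling on a timer.
import sys
import time

prompt = "ready to type here: "
block = "\u2588"  # a solid block stands in for the cursor glyph

try:
    for i in range(20):               # blink ten times, then stop
        glyph = block if i % 2 == 0 else " "
        # "\r" returns to the start of the line so we can redraw it in place.
        sys.stdout.write("\r" + prompt + glyph)
        sys.stdout.flush()
        time.sleep(0.5)               # a typical cursor blinks about 1-2 times a second
finally:
    sys.stdout.write("\r" + prompt + " \n")
```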

The article credits Apple with popularizing blinking cursors to the masses. And it also remembers a fun story about Steve Jobs (shared by Thomas Haigh, a professor of technology history at the University of Wisconsin-Milwaukee): While he was in support of the blinking cursor itself, Haigh says Steve Jobs was famously against controlling it using cursor keys. Jobs attempted — and failed — to remove these keys from the original Mac in an effort to force users into using a mouse instead. In an interaction with biographer Walter Isaacson years later, he even pried them off with his car keys before signing his autograph on the keyboard.
IBM

IBM Tries To Sell Watson Health Again (axios.com) 17

IBM has resurrected its sale process for IBM Watson Health, with hopes of fetching more than $1 billion, people familiar with the situation told Axios. From the report: Big Blue wants out of health care, after spending billions to stake its claim, just as rival Oracle is moving big into the sector via its $28 billion bet for Cerner. IBM spent more than $4 billion to build Watson Health via a series of acquisitions. The business now includes health care data and analytics business Truven Health Analytics, population health company Phytel, and medical imaging business Merge Healthcare. IBM first explored a sale of the division in early 2021, with Morgan Stanley leading the process. WSJ reported at the time that the unit was generating roughly $1 billion in annual revenue, but was unprofitable. Sources say it continues to lose money.
Programming

'A Quadrillion Mainframes On Your Lap' (ieee.org) 101

"Your laptop is way more powerful than you might realize," writes long-time Slashdot reader fahrbot-bot.

"People often rhapsodize about how much more computer power we have now compared with what was available in the 1960s during the Apollo era. Those comparisons usually grossly underestimate the difference."

Rodney Brooks, emeritus professor of robotics at MIT (and former director of their AI Lab and CSAIL) explains in IEEE Spectrum: By 1961, a few universities around the world had bought IBM 7090 mainframes. The 7090 was the first line of all-transistor computers, and it cost US $20 million in today's money, or about 6,000 times as much as a top-of-the-line laptop today. Its early buyers typically deployed the computers as a shared resource for an entire campus. Very few users were fortunate enough to get as much as an hour of computer time per week.

The 7090 had a clock cycle of 2.18 microseconds, so the operating frequency was just under 500 kilohertz. But in those days, instructions were not pipelined, so most took more than one cycle to execute. Some integer arithmetic took up to 14 cycles, and a floating-point operation could hog up to 15. So the 7090 is generally estimated to have executed about 100,000 instructions per second. Most modern computer cores can operate at a sustained rate of 3 billion instructions per second, with much faster peak speeds. That is 30,000 times as fast, so a modern chip with four or eight cores is easily 100,000 times as fast.

Unlike the lucky person in 1961 who got an hour of computer time, you can run your laptop all the time, racking up more than 1,900 years of 7090 computer time every week....

But, really, this comparison is unfair to today's computers. Your laptop probably has 16 gigabytes of main memory. The 7090 maxed out at 144 kilobytes. To run the same program would require an awful lot of shuffling of data into and out of the 7090 — and it would have to be done using magnetic tapes. The best tape drives in those days had maximum data-transfer rates of 60 KB per second. Although 12 tape units could be attached to a single 7090 computer, that rate needed to be shared among them. But such sharing would require that a group of human operators swap tapes on the drives; to read (or write) 16 GB of data this way would take three days. So data transfer, too, was slower by a factor of about 100,000 compared with today's rate.

So now the 7090 looks to have run at about a quadrillionth (10^-15) the speed of your 2021 laptop. A week of computing time on a modern laptop would take longer than the age of the universe on the 7090.
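
For anyone who wants to check the arithmetic, the estimates above can be reproduced from the quoted figures in a few lines; the last step shows one plausible way the compute, transfer-rate, and memory-capacity gaps -- each roughly a factor of 100,000 -- multiply out to the quadrillion-fold figure:

```python
# A sketch (not from the article) that reproduces its back-of-the-envelope
# numbers from the figures quoted above. The last step is one plausible way
# to recover the "quadrillionth" figure.

SECONDS_PER_WEEK = 7 * 24 * 3600
SECONDS_PER_YEAR = 365.25 * 24 * 3600

compute_factor = 100_000                      # laptop vs. 7090 instruction rate
print(compute_factor * SECONDS_PER_WEEK / SECONDS_PER_YEAR)
# -> ~1916, i.e. "more than 1,900 years" of 7090 time per laptop-week

ram_bytes = 16 * 1024**3                      # 16 GB of laptop main memory
tape_rate = 60 * 1024                         # best-case tape transfer, ~60 KB/s
print(ram_bytes / tape_rate / 86_400)
# -> ~3.2 days to stream 16 GB through the 7090's tape drives

memory_factor = ram_bytes / (144 * 1024)      # 16 GB vs. the 7090's 144 KB maximum
transfer_factor = 100_000                     # data-rate gap stated in the article
print(f"{compute_factor * transfer_factor * memory_factor:.1e}")
# -> ~1.2e15: three gaps of roughly 100,000 each, hence "a quadrillionth"
```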

Power

IBM and Samsung Say Their New Chip Design Could Lead To Week-Long Battery Life On Phones (theverge.com) 85

IBM and Samsung have announced their latest advance in semiconductor design: a new way to stack transistors vertically on a chip (instead of lying flat on the surface of the semiconductor). The Verge reports: The new Vertical Transport Field Effect Transistors (VTFET) design is meant to succeed the current FinFET technology that's used for some of today's most advanced chips and could allow for chips that are even more densely packed with transistors than today. In essence, the new design would stack transistors vertically, allowing for current to flow up and down the stack of transistors instead of the side-to-side horizontal layout that's currently used on most chips. Vertical designs for semiconductors have been a trend for a while (FinFET already offers some of those benefits); Intel's future roadmap also looks to move in that direction, too, although its initial work focused on stacking chip components rather than individual transistors. It makes sense, after all: when you've run out of ways to add more chips in one plane, the only real direction (other than physically shrinking transistor technology) is to go up.

While we're still a ways away from VTFET designs being used in actual consumer chips, the two companies are making some big claims, noting that VTFET chips could offer a "two times improvement in performance or an 85 percent reduction in energy use" compared to FinFET designs. And by packing more transistors into chips, IBM and Samsung claim that VTFET technology could help keep Moore's law's goal of steadily increasing transistor count moving forward. IBM and Samsung are also citing some ambitious possible use cases for the new technology, raising the idea of "cell phone batteries that could go over a week without being charged, instead of days," less energy-intensive cryptocurrency mining or data encryption, and even more powerful IoT devices or even spacecraft.

Technology

How Much Has Quantum Computing Actually Advanced? (ieee.org) 13

For a measured perspective on how much quantum computing is actually advancing as a field, IEEE Spectrum spoke with John Martinis, a professor of physics at the University of California, Santa Barbara, and the former chief architect of Google's Sycamore. From a report: IEEE Spectrum: So it's been about two years since you unveiled results from Sycamore. In the last few weeks, we've seen announcements of a 127-qubit chip from IBM and a 256-qubit neutral atom quantum computer from QuEra. What kind of progress would you say has actually been made?
John Martinis: Well, clearly, everyone's working hard to build a quantum computer. And it's great that there are all these systems people are working on. There's real progress. But if you go back to one of the points of the quantum supremacy experiment -- and something I've been talking about for a few years now -- one of the key requirements is gate errors. I think gate errors are way more important than the number of qubits at this time. It's nice to show that you can make a lot of qubits, but if you don't make them well enough, it's less clear what the advance is. In the long run, if you want to do a complex quantum computation, say with error correction, you need way below 1% gate errors. So it's great that people are building larger systems, but it would be even more important to see data on how well the qubits are working. In this regard, I am impressed with the group in China who reproduced the quantum supremacy results, where they show that they can operate their system well with low errors.
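
Martinis's emphasis on gate errors has a simple quantitative basis: if each gate fails with probability p, a circuit of N gates runs cleanly with probability roughly (1 - p)^N, so fidelity collapses exponentially with circuit depth no matter how many qubits are available. A hypothetical back-of-the-envelope illustration (not from the interview):

```python
# Rough illustration of why gate error rates dominate: the probability that a
# circuit runs without any gate error falls off exponentially with depth.
def circuit_success(gate_error: float, n_gates: int) -> float:
    return (1.0 - gate_error) ** n_gates

for p in (0.01, 0.001, 0.0001):        # 1%, 0.1%, 0.01% error per gate
    for n in (100, 1_000, 10_000):
        print(f"error={p:.4f}  gates={n:>6}  success~{circuit_success(p, n):.3g}")

# With 1% errors, a 1,000-gate circuit succeeds only ~4e-5 of the time; pushing
# errors well below 1% (or adding error correction) is what makes deeper
# circuits -- and hence more useful computations -- possible.
```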

Intel

Intel's Expensive New Plan to Upgrade Its Chip Technology - and US Manufacturing (cnet.com) 131

America's push to manufacture more products domestically gets an in-depth look from CNET — including a new Intel chip factory outside of Phoenix.

CNET calls it a fork in the road "after squandering its lead because of a half decade of problems modernizing its manufacturing..." "With a decade of bad decisions, this doesn't get fixed overnight," says Pat Gelsinger, Intel's new chief executive, in an interview. "But the bottom is behind us and the slope is starting to feel increasingly strong...." More fabs are on the way, too. In an enormous empty patch of dirt at its existing Arizona site, Intel has just begun building fabs 52 and 62 at a total cost of $20 billion, set to make Intel's most advanced chips, starting in 2024. Later this year, it hopes to announce the U.S. location for its third major manufacturing complex, a 1,000-acre site costing about $100 billion. The spending commitment makes this year's $3.5 billion upgrade to its New Mexico fab look cheap. The goal is to restore the U.S. share of chip manufacturing, which has slid from 37% in 1990 to 12% today. "Over the decade in front of us, we should be striving to bring the U.S. to 30% of worldwide semiconductor manufacturing," Gelsinger says...

But returning Intel to its glory days — and anchoring a resurgent U.S. electronics business in the process — is much easier said than done. Making chips profitably means running fabs at maximum capacity to pay off the gargantuan investments required to stay at the leading edge. A company that can't keep pace gets squeezed out, like IBM in 2014 or Global Foundries in 2018. To catch up after its delays, Intel now plans to upgrade its manufacturing five times in the next four years, a breakneck pace by industry standards. "This new roadmap that they announced is really aggressive," says Linley Group analyst Linley Gwennap. "I don't have any idea how they are going to accomplish all of that...."

Gelsinger has a tech-first recovery plan. He's pledged to accelerate manufacturing upgrades to match the technology of TSMC and Samsung by 2024 and surpass them in 2025. He's opening Intel's fabs to other companies that need chips built through its new Intel Foundry Services (IFS). And he's relying on other foundries, including TSMC, for about a quarter of Intel's near-term chipmaking needs to keep its chips more competitive during the upgrades. This three-pronged strategy is called IDM (integrated design and manufacturing) 2.0. That's a new take on Intel's philosophy of both designing and making chips. It's more ambitious than the future some had expected, in which Intel would sell its factories and join the ranks of "fabless" chip designers like Nvidia, AMD and Qualcomm that rely on others for manufacturing...

Shareholders may not like Gelsinger's spending-heavy strategy, but one community really does: Intel's engineers... Gelsinger told the board that Intel is done with stock buybacks, a financial move in which a company uses its cash to buy its own stock and thereby increase its price. "We're investing in factories," he told me. "That's going to be the use of our cash...."

"We cannot recall the last time Intel put so many stakes in the ground," said BMO Capital Markets analyst Ambrish Srivastava in a July research report after Intel announced its schedule.

Intel will even outpace Moore's law, Gelsinger tells CNET — more than doubling the transistor count on processors every two years. "I believe that you're going to see from 2025 to 2035 a very healthy period for Moore's Law-like behavior."

Although that still brings some risk to Intel's investments if it has to pass the costs on to customers, a Linley Group analyst points out to CNET. "Moore's Law is not going to end when we can't build smaller transistors. It's going to end when somebody says I don't want to pay for smaller transistors."
Supercomputing

Japan's Fugaku Retains Title As World's Fastest Supercomputer (datacenterdynamics.com) 13

According to a report from Nikkei Asia (paywalled), "The Japanese-made Fugaku captured its fourth consecutive title as the world's fastest supercomputer on Tuesday, although a rival from the U.S. or China is poised to steal the crown as soon as next year." From a report: But while Fugaku is the world's most powerful public supercomputer, at 442 petaflops, China is believed to secretly operate two exascale (1,000 petaflops) supercomputers, which were launched earlier this year. The top 10 list did not change much since the last report six months ago, with only one new addition -- a Microsoft Azure system called Voyager-EUS2. Voyager, featuring AMD Epyc CPUs and Nvidia A100 GPUs, achieved 30.05 petaflops, making it the tenth most powerful supercomputer in the world.

The other systems remained in the same position -- after Japan's Arm-based Fugaku comes the US Summit system, an IBM Power and Nvidia GPU supercomputer capable of 148 petaflops. The similarly-architected 94 petaflops US Sierra system is next. Then comes what is officially China's most powerful supercomputer, the 93 petaflops Sunway TaihuLight, which features Sunway chips. The Biden administration sanctioned the company earlier this year.
You can read a summary of the systems in the Top10 here.
Intel

The Chip That Changed the World (wsj.com) 97

The world changed on Nov. 15, 1971, and hardly anyone noticed. It is the 50th anniversary of the launch of the Intel 4004 microprocessor, a computer carved onto silicon, an element as plentiful on earth as sand on a beach. Microprocessors unchained computers from air-conditioned rooms and freed computing power to go wherever it is needed most. Life has improved exponentially since. From a report: Back then, IBM mainframes were kept in sealed rooms and were so expensive companies used argon gas instead of water to put out computer-room fires. Workers were told to evacuate on short notice, before the gas would suffocate them. Feeding decks of punch cards into a reader and typing simple commands into clunky Teletype machines were the only ways to interact with the IBM computers. Digital Equipment Corp. sold PDP-8 minicomputers to labs and offices that weighed 250 pounds. In 1969, Nippon Calculating Machine Corp. asked Intel to design 12 custom chips for a new printing calculator. Engineers Federico Faggin, Stanley Mazor and Ted Hoff were tired of designing different chips for various companies and suggested instead four chips, including one programmable chip they could use for many products. Using only 2,300 transistors, they created the 4004 microprocessor. Four bits of data could move around the chip at a time. The half-inch-long rectangular integrated circuit had a clock speed of 750 kilohertz and could do about 92,000 operations a second.

Intel introduced the 3,500-transistor, eight-bit 8008 in 1972; the 29,000-transistor, 16-bit 8086, capable of 710,000 operations a second, was introduced in 1978. IBM used the next iteration, the Intel 8088, for its first personal computer. By comparison, Apple's new M1 Max processor has 57 billion transistors doing 10.4 trillion floating-point operations a second. That is at least a billionfold increase in computer power in 50 years. We've come a long way, baby. When I met Mr. Hoff in the 1980s, he told me that he once took his broken television to a repairman, who noted a problem with the microprocessor. The repairman then asked why he was laughing. Now that everyone has a computer in his pocket, one of my favorite movie scenes isn't quite so funny. In "Take the Money and Run" (1969), Woody Allen's character interviews for a job at an insurance company and his interviewer asks, "Have you ever had any experience running a high-speed digital electronic computer?" "Yes, I have." "Where?" "My aunt has one."

IBM

IBM Claims Quantum Computing Breakthrough (axios.com) 79

Axios reports: IBM has created a quantum processor able to process information so complex the work can't be done or simulated on a traditional computer, CEO Arvind Krishna told "Axios on HBO" ahead of a planned announcement.

Why it matters: Quantum computing could help address problems that are too challenging for even today's most powerful supercomputers, such as figuring out how to make better batteries or sequester carbon emissions.

Driving the news: IBM says its new Eagle processor can handle 127 qubits, a measure of quantum computing power. In topping 100 qubits, IBM says it has reached a milestone that allows quantum to surpass the power of a traditional computer. "It is impossible to simulate it on something else, which implies it's more powerful than anything else," Krishna told "Axios on HBO...."
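
There is a simple counting argument behind the "impossible to simulate" claim: a brute-force classical simulation has to store 2^n complex amplitudes for n qubits, so the memory needed doubles with every qubit added. A rough sketch of the scale (an illustration, not IBM's analysis):

```python
# Why ~127 qubits cannot be brute-force simulated: a full statevector needs
# 2**n complex amplitudes, at 16 bytes each in double precision.
def statevector_bytes(n_qubits: int) -> float:
    return 16.0 * 2 ** n_qubits

for n in (27, 53, 100, 127):
    b = statevector_bytes(n)
    print(f"{n:>3} qubits: {b:.2e} bytes (~{b / 1e12:.2e} TB)")

# 27 qubits fits in a couple of gigabytes; 53 qubits already needs ~144 PB;
# 127 qubits would need ~2.7e39 bytes -- vastly more memory than exists on Earth.
```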

Krishna says the quantum computing push is one part of his approach to return the company to growth.

"Some people think of it as science fiction," Krishna says of quantum computing in a preview clip from the Axios interview. "I think of it now as a feat of engineering."
