Programming

AI Tackles Aging COBOL Systems as Legacy Code Expertise Dwindles 76

US government agencies and Fortune 500 companies are turning to AI to modernize mission-critical systems built on COBOL, a programming language dating back to the late 1950s. The US Social Security Administration plans a three-year, $1 billion AI-assisted upgrade of its legacy COBOL codebase, according to Bloomberg.

Treasury Secretary Scott Bessent has repeatedly stressed the need to overhaul government systems running on COBOL. As experienced programmers retire, organizations face growing challenges maintaining these systems that power everything from banking applications to pension disbursements. Engineers now use tools like ChatGPT and IBM's watsonx to interpret COBOL code, create documentation, and translate it to modern languages.
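
For a concrete sense of what that workflow looks like, here is a minimal, illustrative sketch of prompting a large language model to document and translate a single COBOL paragraph through the OpenAI Python SDK. The COBOL fragment, the field names, and the model name are invented placeholders, and this is not the agencies' actual tooling:

```python
# Hedged illustration: ask an LLM to explain and translate a COBOL paragraph.
# The COBOL below and the model name are placeholders, not real SSA code.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

cobol_fragment = """
       COMPUTE WS-MONTHLY-BENEFIT ROUNDED =
           WS-PIA * WS-REDUCTION-FACTOR
         ON SIZE ERROR
           MOVE ZERO TO WS-MONTHLY-BENEFIT
       END-COMPUTE.
"""

prompt = (
    "Explain what this COBOL paragraph does, then translate it to Python, "
    "preserving the rounding and overflow behaviour:\n" + cobol_fragment
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder; any capable code model would do
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```

Whatever the model returns still needs review against the original program's behaviour, particularly COBOL's fixed-point decimal arithmetic and rounding rules.
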
IBM

IBM Orders US Sales To Locate Near Customers or Offices (theregister.com) 31

IBM is mandating that U.S. sales and Cloud employees return to the office at least three days a week, with work required at designated client sites, flagship offices, or sales hubs. According to The Register, some IBM employees argue that these policies "represent stealth layoffs because older (and presumably more highly compensated) employees tend to be less willing to uproot their lives, and families where applicable, than the 'early professional hires' IBM has been courting at some legal risk." From the report: In a staff memo seen by The Register, Adam Lawrence, general manager for IBM Americas, billed the return-to-office for most stateside sales personnel as a "return to client initiative." Citing how "remarkable it is when our teams work side by side" at IBM's swanky Manhattan flagship office, unveiled in September 2024, Lawrence added IBM is investing in an Austin, Texas, office to be occupied in 2026.

Whether US sales staff end up working in NYC, Austin, or some other authorized location, Lawrence told them to brace for -- deep breath -- IBM's "new model" of "effective talent acquisition, deployment, and career progression." We're told that model is "centered on client proximity for those dedicated to specific clients, and anchored on core IBM locations for those dedicated to territories or those in above-market leadership roles." The program requires most IBM US sales staff "to work at least three days a week from the client location where their assigned territory decision-makers work, a flagship office, or a sales hub." Those residing more than 50 miles from their assigned location will be offered relocation benefits to move. Sales hubs are an option only for those with more than one dedicated account.

[...] IBM's office policy change reached US Cloud employees in an April 10 memo from Alan Peacock, general manager of IBM Cloud. Peacock set a July 1, 2025, deadline for US Cloud employees to work from an office at least three days per week, with relocating workers given until October 1, 2025. The employee shuffling has been accompanied by rolling layoffs in the US, but hiring in India -- there are at least 10x as many open IBM jobs in India as there are in any other IBM location, according to the corporation's career listings. And earlier this week, IBM said it "is setting up a new software lab in Lucknow," India.

Businesses

US Stock Markets See Worst Day Since Covid Pandemic (theguardian.com) 225

U.S. stock markets suffered their worst day since the Covid pandemic after Donald Trump announced sweeping new tariffs, triggering a global selloff and wiping out $470 billion in value from tech giants Apple and Nvidia. From a report: The tech-heavy Nasdaq fell 6%, while the S&P 500 and the Dow dropped 4.8% and 3.9%, respectively. [...] Meanwhile, the US dollar hit a six-month low, falling at least 2.2% against other major currencies on Thursday morning, and oil prices sank on fears of a global slowdown. Though the US stock market has grown used to tumultuous mornings over the last few weeks, US stock futures -- an indication of the market's likely direction -- had plummeted after the announcement. Hours later, Japan's Nikkei index slumped to an eight-month low and was followed by falls in stock markets in London and across Europe.

Multiple major American business groups have spoken out against the tariffs, including the Business Roundtable, a consortium of leaders of major US companies including JP Morgan, Apple and IBM, which called on the White House to "swiftly reach agreements" and remove the tariffs. "Universal tariffs ranging from 10-50% run the risk of causing major harm to American manufacturers, workers, families and exporters," the Business Roundtable said in a statement. "Damage to the US economy will increase the longer the tariffs are in place and may be exacerbated by retaliatory measures."

IBM

IBM US Cuts May Run Deeper Than Feared - and the Jobs Are Heading To India (theregister.com) 76

The Register: Following our report last week on IBM's ongoing layoffs, current and former employees got in touch to confirm what many suspected: The US cuts run deeper than reported, and the jobs are heading to India. IBM's own careers site numbers back that up. On January 7, 2024, Big Blue listed just 173 open positions in India. On November 23, 2024, there were 2,946 jobs available in the nation. At the time of writing, the IT titan listed 3,866 roles in India.

American jobs listed for these three periods are 192, 376, and 333, respectively, though at least among those being laid off, there's doubt those roles will be filled with job seekers in the States. A current IBMer who won't be there much longer said that after being told to teach recently hired workers in India "everything I know," the reward was a resource action, or RA -- Big Blue's euphemism for a layoff. After receiving an RA notification, employees typically have a set period of time to apply for open roles elsewhere in the mega-corporation. But just because there are open positions listed in the US doesn't mean IBM is making much of an effort to fill them, we are told.

Education

America's College Board Launches AP Cybersecurity Course For Non-College-Bound Students (edweek.org) 26

Besides administering standardized pre-college tests, America's nonprofit College Board designs college-level classes that high school students can take. But now they're also crafting courses "not just with higher education at the table, but industry partners such as the U.S. Chamber of Commerce and the technology giant IBM," reports Education Week.

"The organization hopes the effort will make high school content more meaningful to students by connecting it to in-demand job skills." It believes the approach may entice a new kind of AP student: those who may not be immediately college-bound.... The first two classes developed through this career-driven model — dubbed AP Career Kickstart — focus on cybersecurity and business principles/personal finance, two fast-growing areas in the workforce." Students who enroll in the courses and excel on a capstone assessment could earn college credit in high school, just as they have for years with traditional AP courses in subjects like chemistry and literature. However, the College Board also believes that students could use success in the courses as a selling point with potential employers... Both the business and cybersecurity courses could also help fulfill state high school graduation requirements for computer science education...

The cybersecurity course is being piloted in 200 schools this school year and is expected to expand to 800 schools next school year... [T]he College Board is planning to invest heavily in training K-12 teachers to lead the cybersecurity course.

IBM's director of technology, data and AI called the effort "a really good way for corporations and companies to help shape the curriculum and the future workforce" while "letting them know what we're looking for." In the article the associate superintendent for teaching at a Chicago-area high school district calls the College Board's move a clear signal that "career-focused learning is rigorous, it's valuable, and it deserves the same recognition as traditional academic pathways."

Also interesting is why the College Board says they're doing it: The effort may also help the College Board — founded more than a century ago — maintain AP's prominence as artificial intelligence tools that can already ace nearly every existing AP test take on an ever-greater share of job tasks once performed by humans. "High schools had a crisis of relevance far before AI," David Coleman, the CEO of the College Board, said in a wide-ranging interview with EdWeek last month. "How do we make high school relevant, engaging, and purposeful? Bluntly, it takes [the] next generation of coursework. We are reconsidering the kinds of courses we offer...."

"It's not a pivot because it's not to the exclusion of higher ed," Coleman said. "What we are doing is giving employers an equal voice."

Thanks to long-time Slashdot reader theodp for sharing the article.
Unix

Rebooting A Retro PDP-11 Workstation - and Its Classic 'Venix' UNIX (blogspot.com) 36

This week the "Old Vintage Computing Research" blog published a 21,000-word exploration of the DEC PDP-11, the 16-bit minicomputer sold by Digital Equipment Corporation. Slashdot reader AndrewZX calls the blog post "an excellent deep dive" into the machine's history and capabilities "and the classic Venix UNIX that it ran." The blogger still owns a working 1984 DEC Professional 380, "a tank of a machine, a reasonably powerful workstation, and the most practical PDP-adjacent thing you can actually slap on a (large) desk."

But more importantly, "It runs PRO/VENIX, the only official DEC Unix option for the Pros." In that specific market it was almost certainly the earliest such licensed Unix (in 1983) and primarily competed against XENIX, Microsoft's dominant "small Unix," which first emerged for XT-class systems as SCO XENIX in 1984. You'd wonder how rogue processes could be prevented from stomping on each other in such systems when neither the Intel 8086/8088 nor the IBM PC nor the PC/XT had a memory management unit, and the answer was not to try and just hope for the best. It was for this reason that IBM's own Unix variant PC/IX, developed by Interactive Systems Corporation under contract as their intended AT&T killer, was multitasking but single-user since in such an architecture there could be no meaningful security guarantees...

One of Venix's interesting little idiosyncrasies, seen in all three Pro versions, was the SUPER> prompt when you've logged on as root (there is also a MAINT> prompt when you're single-user...

Although Bill Gates had been their biggest nemesis early on, most of the little Unices that flourished in the 1980s and early 90s met their collective demise at the hands of another man: Linus Torvalds. The proliferation of free Unix alternatives like Linux on commodity PC hardware caused the bottom to fall out of the commercial Unix market.

The blogger even found a 1989 log for the computer's one and only guest login session — which seems to consist entirely of someone named tom trying to exit vi.

But the most touching part of the article comes when the author discovers a file named /thankyou that they're certain didn't come with the original Venix. It's an ASCII drawing of a smiling face, under the words "THANK YOU FOR RESCUING ME".

"It's among the last files created on the system before it came into my possession..."

It's all a fun look back to a time when advances in semiconductor density meant microcomputers could do nearly as much as the more expensive minicomputers (while taking up less space) — leaving corporations pondering the new world that was coming: As far back as 1974, an internal skunkworks unit had presented management with two small systems prototypes described as a PDP-8 in a VT50 terminal and a portable PDP-11 chassis.

Engineers were intrigued but sales staff felt these smaller versions would cut into their traditional product lines, and [DEC president Ken] Olsen duly cancelled the project, famously observing no one would want a computer in their home.

IBM

IBM Cuts Thousands of Jobs, Cloud Classic Unit Hit Hard: Report (theregister.com) 49

IBM is laying off thousands of employees across the United States, with approximately 25% of staff at its Cloud Classic operation affected, The Register reports, citing a source. "Concrete numbers are being kept private," a source told the publication. "It is in the thousands."

Staff reductions have occurred in Raleigh, North Carolina; New York; Dallas, Texas; and California, the report said. Affected departments include consulting, corporate social responsibility, cloud infrastructure, sales, and internal systems teams. The report adds: With regard to IBM Cloud Classic -- the infrastructure-as-a-service (IaaS) offering built on IBM's 2013 acquisition of SoftLayer -- another source told us: "It's a resource action. I don't know how many people are in IaaS classic. They don't typically make that information easy to find. What I can say is that they have been making a lot of changes to shift employment to India as much as possible."

A third source, newly let go by Big Blue, said it was fair to characterize this as a layoff. "Everyone I know that was affected, myself included, was simply offered a separation agreement," this individual said, estimating that 10 percent of the Cloud group (which is not the same as Cloud Classic) has been let go.

Programming

IBM CEO Doesn't Think AI Will Replace Programmers Anytime Soon (techcrunch.com) 58

IBM CEO Arvind Krishna has publicly disagreed with Anthropic CEO Dario Amodei's prediction that AI will write 90% of code within 3-6 months, estimating instead that only "20-30% of code could get written by AI."

"Are there some really simple use cases? Yes, but there's an equally complicated number of ones where it's going to be zero," Krishna said during an onstage interview at SXSW. He argued AI will boost programmer productivity rather than eliminate jobs. "If you can do 30% more code with the same number of people, are you going to get more code written or less?" he asked. "History has shown that the most productive company gains market share, and then you can produce more products."
Privacy

Thousands of Exposed GitHub Repositories, Now Private, Can Still Be Accessed Through Copilot (techcrunch.com) 19

An anonymous reader quotes a report from TechCrunch: Security researchers are warning that data exposed to the internet, even for a moment, can linger in online generative AI chatbots like Microsoft Copilot long after the data is made private. Thousands of once-public GitHub repositories from some of the world's biggest companies are affected, including Microsoft's, according to new findings from Lasso, an Israeli cybersecurity company focused on emerging generative AI threats.

Lasso co-founder Ophir Dror told TechCrunch that the company found content from its own GitHub repository appearing in Copilot because it had been indexed and cached by Microsoft's Bing search engine. Dror said the repository, which had been mistakenly made public for a brief period, had since been set to private, and accessing it on GitHub returned a "page not found" error. "On Copilot, surprisingly enough, we found one of our own private repositories," said Dror. "If I was to browse the web, I wouldn't see this data. But anyone in the world could ask Copilot the right question and get this data."

After it realized that any data on GitHub, even briefly, could be potentially exposed by tools like Copilot, Lasso investigated further. Lasso extracted a list of repositories that were public at any point in 2024 and identified the repositories that had since been deleted or set to private. Using Bing's caching mechanism, the company found more than 20,000 since-private GitHub repositories still had data accessible through Copilot, affecting more than 16,000 organizations. Lasso told TechCrunch ahead of publishing its research that affected organizations include Amazon Web Services, Google, IBM, PayPal, Tencent, and Microsoft. [...] For some affected companies, Copilot could be prompted to return confidential GitHub archives that contain intellectual property, sensitive corporate data, access keys, and tokens, the company said.

AI

AI Reshapes Corporate Workforce as Companies Halt Traditional Hiring 119

Major corporations are reshaping their workforces around AI, with Salesforce announcing it will not hire software engineers in 2025 and other companies laying off thousands while shifting focus to AI-specific roles. Duolingo has laid off thousands after implementing GPT-4, UPS cut 4,000 jobs in its largest layoff in 116 years, and IBM paused hiring for back-office and HR positions that AI can now handle.

Amazon is redirecting staff from Alexa to AI areas, while Intuit is laying off 10% of its non-AI workforce. Cisco plans to cut 7% of employees in its second round of job cuts this year as it prioritizes AI and cybersecurity. Salesforce reports its AI platform is boosting software engineering productivity by 30%. SAP is restructuring 8,000 positions to focus on AI-driven business areas. The trend extends globally, with Microsoft relocating thousands during an "exodus" from China, while entry-level jobs on Wall Street are becoming obsolete.

A study found that 3 out of 10 companies replaced workers with AI last year, with over one-third of firms using AI likely to automate more roles in 2025. Job listings at large privately-held AI companies have dropped 14.2% over six months, JP Morgan wrote in a note seen by Slashdot. The transformation is creating new opportunities, with rising demand for AI skills in job postings. A survey of more than 1,200 users found nearly two-thirds of young professionals use AI tools at work, with 93% not worried about job threats, as business leaders view Generation Z's digital skills as beneficial for leveraging AI.
Operating Systems

ArcaOS (OS/2 Warp OEM) 5.1.1 Has Been Released (arcanoae.com) 88

"IBM stopped supporting OS/2 at the end of 2006," write the makers of ArcaOS, an OEM distribution of OS/2's discontinued Warp operating system.

And now long-time Slashdot reader martiniturbide tells us that ArcaOS 5.1.1 has been released, and that many of its components have been updated too. From this week's announcement: ArcaOS 5.1.1 continues to support installation on the latest generation of UEFI-based systems, as well as the ability to install to GPT-based disk layouts. This enables ArcaOS 5.1.1 to install on a wide array of modern hardware. Of course, ArcaOS 5.1.1 is just as much at home on traditional BIOS-based systems, offering enhanced stability and performance across both environments....

Need more convincing? How about a commercial operating system which doesn't spy on you, does not report your online activity to anyone, and gives you complete freedom to choose the applications you want to use, however you want to use them? How about an operating system which isn't tied to any specific hardware manufacturer, allowing you to choose the platform which is right for you, and fits perfectly well in systems with less than 4GB of memory or even virtual machines?

Red Hat Software

Red Hat Plans to Add AI to Fedora and GNOME 49

In his post about the future of Fedora Workstation, Christian F.K. Schaller discusses how the Red Hat team plans to integrate AI with IBM's open-source Granite engine to enhance developer tools, such as IDEs, and create an AI-powered Code Assistant. He says the team is also working on streamlining AI acceleration in Toolbx and ensuring Fedora users have access to tools like RamaLama. From the post: One big item on our list for the year is looking at ways Fedora Workstation can make use of artificial intelligence. Thanks to IBM's Granite effort we now have an AI engine that is available under proper open source licensing terms and which can be extended for many different use cases. Also, the IBM Granite team has an aggressive plan for releasing updated versions of Granite, incorporating new features of special interest to developers, like making Granite a great engine to power IDEs and similar tools. We have been brainstorming various ideas in the team for how we can make use of AI to provide improved or new features to users of GNOME and Fedora Workstation. This includes making sure Fedora Workstation users have access to great tools like RamaLama, that setting up accelerated AI inside Toolbx is simple, that we offer a good Code Assistant based on Granite, and that we come up with other cool integration points.

"I'm still not sure how I feel about this approach," writes designer, developer, and blogger Bradley Taunt. "While IBM Granite is an open source model, I still don't enjoy so much artificial 'intelligence' creeping into core OS development. This also isn't something optional on the end-user's side, like a desktop feature or package. This sounds like it's going to be built directly into the core system."

"Red Hat has been pushing hard towards AI and my main concern is having this influence other operating system dev teams. Luckily things seems AI-free in BSD land. For now, at least."
Supercomputing

Quantum Computer Built On Server Racks Paves the Way To Bigger Machines (technologyreview.com) 27

An anonymous reader quotes a report from MIT Technology Review: A Canadian startup called Xanadu has built a new quantum computer it says can be easily scaled up to achieve the computational power needed to tackle scientific challenges ranging from drug discovery to more energy-efficient machine learning. Aurora is a "photonic" quantum computer, which means it crunches numbers using photonic qubits -- information encoded in light. In practice, this means combining and recombining laser beams on multiple chips using lenses, fibers, and other optics according to an algorithm. Xanadu's computer is designed in such a way that the answer to an algorithm it executes corresponds to the final number of photons in each laser beam. This approach differs from one used by Google and IBM, which involves encoding information in properties of superconducting circuits.

Aurora has a modular design that consists of four similar units, each installed in a standard server rack that is slightly taller and wider than the average human. To make a useful quantum computer, "you copy and paste a thousand of these things and network them together," says Christian Weedbrook, the CEO and founder of the company. Ultimately, Xanadu envisions a quantum computer as a specialized data center, consisting of rows upon rows of these servers. This contrasts with the industry's earlier conception of a specialized chip within a supercomputer, much like a GPU. [...]

Xanadu's 12 qubits may seem like a paltry number next to IBM's 1,121, but Tiwari says this doesn't mean that quantum computers based on photonics are running behind. In his opinion, the number of qubits reflects the amount of investment more than it does the technology's promise. [...] Xanadu's next goal is to improve the quality of the photons in the computer, which will ease the error correction requirements. "When you send lasers through a medium, whether it's free space, chips, or fiber optics, not all the information makes it from the start to the finish," he says. "So you're actually losing light and therefore losing information." The company is working to reduce this loss, which means fewer errors in the first place. Xanadu aims to build a quantum data center, with thousands of servers containing a million qubits, in 2029.
The company published its work on chip design optimization and fabrication in the journal Nature.
Oracle

Oracle Faces Java Customer Revolt After 'Predatory' Pricing Changes (theregister.com) 136

Nearly 90% of Oracle Java customers are looking to abandon the software maker's products following controversial licensing changes made in 2023, according to research firm Dimensional Research.

The exodus reflects growing frustration with Oracle's shift to per-employee pricing for its Java platform, a change critics have called "predatory" and one that, Gartner found, could increase costs up to five times for the same software. The dissatisfaction runs deepest in Europe, where 92% of French and 95% of German users want to switch to alternative providers like BellSoft Liberica, IBM Semeru, or Azul Platform Core.
Games

Complexity Physics Finds Crucial Tipping Points In Chess Games (arstechnica.com) 12

An anonymous reader quotes a report from Ars Technica: The game of chess has long been central to computer science and AI-related research, most notably in IBM's Deep Blue in the 1990s and, more recently, AlphaZero. But the game is about more than algorithms, according to Marc Barthelemy, a physicist at Paris-Saclay University in France, with layers of depth arising from the psychological complexity conferred by player strategies. Now, Barthelemy has taken things one step further by publishing a new paper in the journal Physical Review E that treats chess as a complex system, producing a handy metric that can help predict the proverbial "tipping points" in chess matches. [...]

For his analysis, Barthelemy chose to represent chess as a decision tree in which each "branch" leads to a win, loss, or draw. Players face the challenge of finding the best move amid all this complexity, particularly midgame, in order to steer gameplay into favorable branches. That's where those crucial tipping points come into play. Such positions are inherently unstable, which is why even a small mistake can have a dramatic influence on a match's trajectory. Barthelemy has re-imagined a chess match as a network of forces in which pieces act as the network's nodes, and the ways they interact represent the edges, using an interaction graph to capture how different pieces attack and defend one another. The most important chess pieces are those that interact with many other pieces in a given match, which he calculated by measuring how frequently a node lies on the shortest path between all the node pairs in the network (its "betweenness centrality").

He also calculated so-called "fragility scores," which indicate how easy it is to remove those critical chess pieces from the board. And he was able to apply this analysis to more than 20,000 actual chess matches played by the world's top players over the last 200 years. Barthelemy found that his metric could indeed identify tipping points in specific matches. Furthermore, when he averaged his analysis over a large number of games, an unexpected universal pattern emerged. "We observe a surprising universality: the average fragility score is the same for all players and for all openings," Barthelemy writes. And in famous chess matches, "the maximum fragility often coincides with pivotal moments, characterized by brilliant moves that decisively shift the balance of the game." Specifically, fragility scores start to increase about eight moves before the critical tipping point position occurs and stay high for some 15 moves after that.
"These results suggest that positional fragility follows a common trajectory, with tension peaking in the middle game and dissipating toward the endgame," writes Barthelemy. "This analysis highlights the complex dynamics of chess, where the interaction between attack and defense shapes the game's overall structure."
Operating Systems

How the OS/2 Flop Went On To Shape Modern Software (theregister.com) 167

"It's fair to say that by 1995, OS/2 was dead software walking," remembers a new article from the Register (which begins with a 1995 Usenet post from Gordon Letwin, Microsoft's lead architect on the OS/2 project).

But the real question is why this Microsoft-IBM collaboration on a DOS-replacing operating system ultimately lost out to Windows...? If OS/2 1.0 had been an 80386 OS, and had been able to multitask DOS apps, we think it would have been a big hit.... OS/2's initial 1980s versions were 16-bit products, at IBM's insistence. That is when the war was lost. That is when OS/2 flopped. Because its initial versions were even more crippled than the Deskpro 386...

Because OS/2 1.x flopped, Microsoft launched a product that fixed the key weakness of OS/2 1.x. That product was Windows 3, which worked perfectly acceptably on 286 machines, but if you ran the same installed copy on a 32-bit 386 PC, it worked better. Windows 3.0 could use the more sophisticated hardware of a 386 to give better multitasking of the market-dominating DOS apps...

IBM's poor planning shaped the PC industry of the 1990s more than Microsoft's successes. Windows 3.0 wasn't great, but it was good enough. It reversed people's perception of Windows after the failures of Windows 1 and Windows 2. Windows 3 achieved what OS/2 had intended to do. It transformed IBM PC compatibles from single-tasking text-only computers into graphical computers, with poor but just about usable multitasking...

Soon after Windows 3.0 turned out to be a hit, OS/2 NT was rebranded as Windows NT. Even the most ardent Linux enthusiast must concede that Windows NT did quite well over three decades.

Back in 1995, the Register's author says they'd moved from OS/2 to Windows 95 "while it was still in beta.

"The UI was far superior, more hardware worked, and Doom ran much better."
IBM

IBM and GlobalFoundries Settle Multibillion-Dollar Trade Secret and Contract Lawsuits (theregister.com) 3

The Register's Jude Karabus reports: IBM and semiconductor maker GlobalFoundries have settled all of their litigation against each other, including breach of contract, patent, and trade secret suits, the pair say. The details of the settlement are confidential. All that both companies were prepared to say in yesterday's statements was that the deal they'd agreed would resolve "all litigation matters, inclusive of breach of contract, trade secrets, and intellectual property claims between the two companies." They added that the settlement would allow the companies to "explore new opportunities for collaboration in areas of mutual interest." In 2021, IBM sued GlobalFoundries for $2.5 billion, accusing it of failing to deliver on 10nm and 7nm chip production commitments, which disrupted IBM's hardware roadmap. GlobalFoundries countersued in 2023, alleging IBM misused trade secrets and poached engineers to support partnerships with Intel and Rapidus, potentially compromising proprietary technologies.
Printer

Xerox To Buy Printer Maker Lexmark From Chinese Owners in $1.5 Billion Deal (xerox.com) 30

Xerox has agreed to acquire printer maker Lexmark for $1.5 billion, bringing the Kentucky-based company back under U.S. ownership after seven years of Chinese control.

The deal, announced Monday, will be financed through cash and debt, creating a vertically integrated printing equipment manufacturer and service provider. Lexmark, spun off from IBM in 1991, was previously acquired by Chinese investors including Ninestar for $2.54 billion in 2016. The merger comes as Xerox faces declining equipment sales and a 50% year-to-date stock drop, with its market value at just over $1 billion.
Google

Scientific Breakthrough Gives New Hope To Building Quantum Computers (ft.com) 83

Google has achieved a major breakthrough in quantum error correction that could enable practical quantum computers by 2030, the company announced in a paper published Monday in Nature. The research demonstrated significant error reduction when scaling up from 3x3 to 7x7 grids of quantum bits, with errors dropping by half at each step. The advance addresses quantum computing's core challenge of maintaining stable quantum states, which typically last only microseconds.

Google's new quantum chip, manufactured in-house, maintains quantum states for nearly 100 microseconds -- five times longer than previous versions. The company aims to build a full-scale system with about 1 million qubits, projecting costs around $1 billion by decade's end.

IBM, Google's main rival, questioned the scalability of Google's "surface code" error correction approach, claiming it would require billions of qubits. IBM is pursuing an alternative three-dimensional design requiring new connector technology expected by 2026. The breakthrough parallels the first controlled nuclear chain reaction in 1942, according to MIT physics professor William Oliver, who noted that both achievements required years of engineering to realize theoretical predictions from decades earlier.

Further reading: Google: Meet Willow, our state-of-the-art quantum chip.
Programming

Stanford Research Reveals 9.5% of Software Engineers 'Do Virtually Nothing' (x.com) 237

A Stanford study of over 50,000 software engineers across hundreds of companies has found that approximately 9.5% of engineers perform minimal work while drawing full salaries, potentially costing tech companies billions annually.

The research showed the issue is most prevalent in remote work settings, where 14% of engineers were classified as "ghost engineers" compared to 6% of office-based staff. The study evaluated productivity through analysis of private Git repositories and simulated expert assessments of code commits.

Major tech companies could be significantly impacted, with IBM estimated to have 17,100 underperforming engineers at an annual cost of $2.5 billion. Across the global software industry, the researchers estimate the total cost of underperforming engineers could reach $90 billion, based on a conservative 6.5% rate of "ghost engineers" worldwide.
