Bug

Patched Windows Bug Was Actually a Dangerous Wormable Code-Execution Vulnerability (arstechnica.com) 20

Ars Technica reports on a dangerously "wormable" Windows vulnerability that allowed attackers to execute malicious code with no authentication required — a vulnerability that was present "in a much broader range of network protocols, giving attackers more flexibility than they had when exploiting the older vulnerability." Microsoft fixed CVE-2022-37958 in September during its monthly Patch Tuesday rollout of security fixes. At the time, however, Microsoft researchers believed the vulnerability allowed only the disclosure of potentially sensitive information. As such, Microsoft gave the vulnerability a designation of "important." In the routine course of analyzing vulnerabilities after they're patched, IBM security researcher Valentina Palmiotti discovered it allowed for remote code execution in much the way EternalBlue did [the flaw used to detonate WannaCry]. Last week, Microsoft revised the designation to critical and gave it a severity rating of 8.1, the same given to EternalBlue....

One potentially mitigating factor is that a patch for CVE-2022-37958 has been available for three months. EternalBlue, by contrast, was initially exploited by the NSA as a zero-day. The NSA's highly weaponized exploit was then released into the wild by a mysterious group calling itself Shadow Brokers. The leak, one of the worst in the history of the NSA, gave hackers around the world access to a potent nation-state-grade exploit. Palmiotti said there's reason for optimism, but also risk: "While EternalBlue was an 0-Day, luckily this is an N-Day with a 3 month patching lead time."

There's still some risk, Palmiotti tells Ars Technica. "As we've seen with other major vulnerabilities over the years, such as MS17-010 which was exploited with EternalBlue, some organizations have been slow deploying patches for several months or lack an accurate inventory of systems exposed to the internet and miss patching systems altogether."

Thanks to Slashdot reader joshuark for sharing the article.
Technology

Who Really Invented the Thumb Drive? (ieee.org) 134

IEEE Spectrum: In 2000, at a trade fair in Germany, an obscure Singapore company called Trek 2000 unveiled a solid-state memory chip encased in plastic and attached to a Universal Serial Bus (USB) connector. The gadget, roughly the size of a pack of chewing gum, held 8 megabytes of data and required no external power source, drawing power directly from a computer when connected. It was called the ThumbDrive. That device, now known by a variety of names -- including memory stick, USB stick, flash drive, as well as thumb drive -- changed the way computer files are stored and transferred. Today it is familiar worldwide. The thumb drive was an instant hit, garnering hundreds of orders for samples within hours. Later that year, Trek went public on the Singapore stock exchange, and in four months -- from April through July 2000 -- it manufactured and sold more than 100,000 ThumbDrives under its own label.

Before the invention of the thumb drive, computer users stored and transported their files using floppy disks. Developed by IBM in the 1960s, first 8-inch and later 5 1/4-inch and 3 1/2-inch floppy disks replaced cassette tapes as the most practical portable storage media. Floppy disks were limited by their relatively small storage capacity -- even double-sided, high-density disks could store only 1.44 MB of data. During the 1990s, as the size of files and software increased, computer companies searched for alternatives. Personal computers in the late 1980s began incorporating CD-ROM drives, but initially these could read only from prerecorded disks and could not store user-generated data. The Iomega Zip Drive, called a "superfloppy" drive and introduced in 1994, could store up to 750 MB of data and was writable, but it never gained widespread popularity, partly due to competition from cheaper and higher-capacity hard drives.

Computer users badly needed a cheap, high-capacity, reliable, portable storage device. The thumb drive was all that -- and more. It was small enough to slip in a front pocket or hang from a keychain, and durable enough to be rattled around in a drawer or tote without damage. With all these advantages, it effectively ended the era of the floppy disk. But Trek 2000 hardly became a household name. And the inventor of the thumb drive and Trek's CEO, Henn Tan, did not become as famous as other hardware pioneers like Robert Noyce, Douglas Engelbart, or Steve Jobs. Even in his home of Singapore, few people know of Tan or Trek. Why aren't they more famous? After all, mainstream companies including IBM, TEAC, Toshiba, and, ultimately, Verbatim licensed Trek's technology for their own memory stick devices. And a host of other companies just copied Tan without permission or acknowledgment.

IBM

IBM To Create 24-Core Power Chip So Customers Can Exploit Oracle Database License (theregister.com) 70

IBM has quietly announced it's planning a 24-core Power 10 processor, seemingly to make one of its servers capable of running Oracle's database in a cost-effective fashion. From a report: A hardware announcement dated December 13 revealed the chip in the following "statement of general direction" about Big Blue's Power S1014 technology-based server: "IBM intends to announce a high-density 24-core processor for the IBM Power S1014 system (MTM 9105-41B) to address application environments utilizing an Oracle Database with the Standard Edition 2 (SE2) licensing model. It intends to combine a robust compute throughput with the superior reliability and availability features of the IBM Power platform while complying with Oracle Database SE2 licensing guidelines."
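The arithmetic behind that statement of direction is worth spelling out: SE2 is licensed per occupied processor socket and may only run on servers with at most two sockets, so the only way to grow an SE2 system is to pack more cores into each socket. A rough sketch of that constraint follows; the license price is an assumed placeholder for illustration, not Oracle's actual figure:

```python
# Hypothetical per-socket licensing arithmetic. SE2 is licensed per occupied
# socket and only runs on servers with at most two sockets; the list price
# below is an assumption for illustration, not Oracle's actual figure.
LICENSE_PER_SOCKET = 17_500  # USD, assumed
MAX_SOCKETS = 2              # SE2's server socket cap

def licenses_needed(total_cores: int, cores_per_socket: int) -> int:
    """Sockets (= licenses) required for a core count, or raise if SE2 can't fit it."""
    sockets = -(-total_cores // cores_per_socket)  # ceiling division
    if sockets > MAX_SOCKETS:
        raise ValueError("exceeds SE2's two-socket server limit")
    return sockets

# 24 cores spread over 8-core sockets would need 3 sockets, which SE2 forbids.
# The same 24 cores in one high-density Power socket needs a single license.
print(licenses_needed(24, 24) * LICENSE_PER_SOCKET)
```

This is why a denser socket, rather than more sockets, is the lever: under a per-socket metric with a hard socket cap, core count per socket directly multiplies the compute each license covers.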
Unix

OSnews Decries 'The Mass Extinction of Unix Workstations' (osnews.com) 284

Anyone remember the high-end commercial UNIX workstations from a few decades ago — from companies like IBM, DEC, SGI, and Sun Microsystems?

Today OSnews looked back — but also explored what happens when you try to buy one today: As x86 became ever more powerful and versatile, and with the rise of Linux as a capable UNIX replacement and the adoption of the NT-based versions of Windows, the days of the UNIX workstations were numbered. A few years into the new millennium, virtually all traditional UNIX vendors had ended production of their workstations and in some cases even their associated architectures, with a lacklustre collective effort to move over to Intel's Itanium — which didn't exactly go anywhere and is now nothing more than a sour footnote in computing history.

Approaching roughly 2010, all the UNIX workstations had disappeared.... and by now, they're all pretty much dead (save for Solaris). Users and industries moved on to x86 on the hardware side, and Linux, Windows, and in some cases, Mac OS X on the software side.... Over the past few years, I have come to learn that if you want to get into buying, using, and learning from UNIX workstations today, you'll run into various problems which can roughly be filed into three main categories: hardware availability, operating system availability, and third party software availability.

Their article details their own attempts to buy one over the years, ultimately concluding the experience "left me bitter and frustrated that so much knowledge — in the form of documentation, software, tutorials, drivers, and so on — is disappearing before our very eyes." Shortsightedness and disinterest in their own heritage by corporations, big and small, is destroying entire swaths of software, and as more years pass by, it will get ever harder to get any of these things back up and running.... As for all the third-party software — well, I'm afraid it's too late for that already. Chasing down the rightsholders is already an incredibly difficult task, and even if you do find them, they are probably not interested in helping you, and even if by some miracle they are, they most likely no longer even have the ability to generate the required licenses or release versions with the licensing ripped out. Stuff like Pro/ENGINEER and SoftWindows for UNIX are most likely gone forever....

Software is dying off at an alarming rate, and I fear there's no turning the tide of this mass extinction.

The article also wonders why companies like HPE don't just "dump some ISO files" onto an FTP server, along with patch depots and documentation. "This stuff has no commercial value, they're not losing any sales, and it will barely affect their bottom line."
Science

Physicists Use Google's Quantum Computer to Create Holographic Wormhole Between Black Holes (quantamagazine.org) 55

"In an experiment that ticks most of the mystery boxes in modern physics, a group of researchers announced Wednesday that they had simulated a pair of black holes in a quantum computer," reports the New York Times [alternate URL here]. But in addition, the researchers also sent a message between their two black holes, the Times reports, "through a shortcut in space-time called a wormhole."

"Physicists described the achievement as another small step in the effort to understand the relation between gravity, which shapes the universe, and quantum mechanics, which governs the subatomic realm of particles....

Quanta magazine reports: The wormhole emerged like a hologram out of quantum bits of information, or "qubits," stored in tiny superconducting circuits. By manipulating the qubits, the physicists then sent information through the wormhole, they reported Wednesday in the journal Nature. The team, led by Maria Spiropulu of the California Institute of Technology, implemented the novel "wormhole teleportation protocol" using Google's quantum computer, a device called Sycamore housed at Google Quantum AI in Santa Barbara, California. With this first-of-its-kind "quantum gravity experiment on a chip," as Spiropulu described it, she and her team beat a competing group of physicists who aim to do wormhole teleportation with IBM and Quantinuum's quantum computers.

When Spiropulu saw the key signature indicating that qubits were passing through the wormhole, she said, "I was shaken."

The experiment can be seen as evidence for the holographic principle, a sweeping hypothesis about how the two pillars of fundamental physics, quantum mechanics and general relativity, fit together.... The holographic principle, ascendant since the 1990s, posits a mathematical equivalence or "duality" between the two frameworks. It says the bendy space-time continuum described by general relativity is really a quantum system of particles in disguise. Space-time and gravity emerge from quantum effects much as a 3D hologram projects out of a 2D pattern. Indeed, the new experiment confirms that quantum effects, of the type that we can control in a quantum computer, can give rise to a phenomenon that we expect to see in relativity — a wormhole....

To be clear, unlike an ordinary hologram, the wormhole isn't something we can see. While it can be considered "a filament of real space-time," according to co-author Daniel Jafferis of Harvard University, lead developer of the wormhole teleportation protocol, it's not part of the same reality that we and the Sycamore computer inhabit. The holographic principle says that the two realities — the one with the wormhole and the one with the qubits — are alternate versions of the same physics, but how to conceptualize this kind of duality remains mysterious. Opinions will differ about the fundamental implications of the result. Crucially, the holographic wormhole in the experiment consists of a different kind of space-time than the space-time of our own universe. It's debatable whether the experiment furthers the hypothesis that the space-time we inhabit is also holographic, patterned by quantum bits.

"I think it is true that gravity in our universe is emergent from some quantum [bits] in the same way that this little baby one-dimensional wormhole is emergent" from the Sycamore chip, Jafferis said. "Of course we don't know that for sure. We're trying to understand it."

Here's how principal investigator Spiropulu summarizes their experiment. "We found a quantum system that exhibits key properties of a gravitational wormhole yet is sufficiently small to implement on today's quantum hardware."
Hardware

PCI Standards Group Deflects, Assigns Blame for Melting GPU Power Connectors (arstechnica.com) 130

An anonymous reader shares a report: Nvidia's new RTX 4090 and 4080 GPUs both use a new connector called 12VHPWR to deliver power as a way to satisfy ever-more power-hungry graphics cards without needing to set aside the physical space required for three or four 8-pin power connectors. But that power connector and its specifications weren't created by Nvidia alone -- to ensure interoperability, the spec was developed jointly by the PCI Express Special Interest Group (PCI-SIG), a body that includes Nvidia, AMD, Intel, Arm, IBM, Qualcomm, and others.

But the overheating and melting issues experienced by some RTX 4090 owners recently have apparently prompted the PCI-SIG to clarify exactly which parts of the spec it is and is not responsible for. In a statement reported by Tom's Hardware, the group sent its members a reminder that they, not the PCI-SIG, were responsible for safety testing products using connector specs like 12VHPWR. "Members are reminded that PCI-SIG specifications provide necessary technical information for interoperability and do not attempt to address proper design, manufacturing methods, materials, safety testing, safety tolerances, or workmanship," the statement reads. "When implementing a PCI-SIG specification, Members are responsible for the design, manufacturing, and testing, including safety testing, of their products."

IBM

IBM and Maersk Abandon Ship on TradeLens Logistics Blockchain (coindesk.com) 28

Maersk and IBM will wind down their shipping blockchain TradeLens by early 2023, ending the pair's five-year project to improve global trade by connecting supply chains on a permissioned blockchain. From a report: TradeLens emerged during the "enterprise blockchain" era of 2018 as a high-flying effort to make inter-corporate trade more efficient. Open to shipping and freight operators, its members could validate the transaction of goods as recorded on a transparent digital ledger.

The idea was to save its member-shipping companies money by connecting their world. But the network was only as strong as its participants; despite some early wins, TradeLens ultimately failed to catch on with a critical mass of its target industry. "TradeLens has not reached the level of commercial viability necessary to continue work and meet the financial expectations as an independent business," Maersk Head of Business Platforms Rotem Hershko said in a statement.

Software

Frederick P. Brooks Jr., Computer Design Innovator, Dies at 91 16

Frederick P. Brooks Jr., whose innovative work in computer design and software engineering helped shape the field of computer science, died on Thursday at his home in Chapel Hill, N.C. He was 91. His death was confirmed by his son, Roger, who said Dr. Brooks had been in declining health since having a stroke two years ago. The New York Times reports: Dr. Brooks had a wide-ranging career that included creating the computer science department at the University of North Carolina and leading influential research in computer graphics and virtual reality. But he is best known for being one of the technical leaders of IBM's 360 computer project in the 1960s. At a time when smaller rivals like Burroughs, Univac and NCR were making inroads, it was a hugely ambitious undertaking. Fortune magazine, in an article with the headline "IBM's $5,000,000,000 Gamble," described it as a "bet the company" venture.

Until the 360, each model of computer had its own bespoke hardware design. That required engineers to overhaul their software programs to run on every new machine that was introduced. But IBM promised to eliminate that costly, repetitive labor with an approach championed by Dr. Brooks, a young engineering star at the company, and a few colleagues. In April 1964, IBM announced the 360 as a family of six compatible computers. Programs written for one 360 model could run on the others, without the need to rewrite software, as customers moved from smaller to larger computers. The shared design across several machines was described in a paper, written by Dr. Brooks and his colleagues Gene Amdahl and Gerrit Blaauw, titled "Architecture of the IBM System/360." "That was a breakthrough in computer architecture that Fred Brooks led," Richard Sites, a computer designer who studied under Dr. Brooks, said in an interview.

But there was a problem. The software needed to deliver on the IBM promise of compatibility across machines and the capability to run multiple programs at once was not ready, as it proved to be a far more daunting challenge than anticipated. Operating system software is often described as the command and control system of a computer. The OS/360 was a forerunner of Microsoft's Windows, Apple's iOS and Google's Android. At the time IBM made the 360 announcement, Dr. Brooks was just 33 and headed for academia. He had agreed to return to North Carolina, where he grew up, and start a computer science department at Chapel Hill. But Thomas Watson Jr., the president of IBM, asked him to stay on for another year to tackle the company's software troubles. Dr. Brooks agreed, and eventually the OS/360 problems were sorted out. The 360 project turned out to be an enormous success, cementing the company's dominance of the computer market into the 1980s.
"Fred Brooks was a brilliant scientist who changed computing," Arvind Krishna, IBM's chief executive and himself a computer scientist, said in a statement. "We are indebted to him for his pioneering contributions to the industry."

Dr. Brooks published a book in 1975 titled, "The Mythical Man-Month: Essays on Software Engineering." It was "a quirky classic, selling briskly year after year and routinely cited as gospel by computer scientists," reports the Times.
IBM

IBM Sues Micro Focus, Claims It Copied Big Blue Mainframe Software (theregister.com) 43

IBM has filed a lawsuit against Micro Focus, alleging the enterprise software company copied and reverse-engineered its CICS mainframe service to develop a rival product, the Micro Focus Enterprise Server. From a report: Big Blue has brought the case in the US District Court in New York, citing violation of copyright law and claiming that Micro Focus was in "blatant breach" of its contractual obligations with IBM. In a strongly worded complaint, the company accused UK-based Micro Focus of "brazen theft" of IBM software and said the suit was filed to "protect [its] valuable intellectual property." IBM is seeking compensation as well as an injunction against Micro Focus that would prohibit the company from distributing the products Big Blue labels as "derivative works" it claims are based upon IBM's own computer software.
Businesses

Is Quantum Computing Moving from Theoretical to Startups? (msn.com) 38

The Boston Globe reports that "More money is starting to flow into the nascent field of quantum computing in Boston, turning academic research at MIT and Harvard labs into startups."

In September, Northeastern University announced it will build a $10 million lab at its Burlington campus to explore applications for quantum technology, and to train students to work with it. And companies based in other countries are setting up outposts here to hire quantum-savvy techies....

"It's still pretty early" for quantum computing, says Russ Wilcox, a partner at the venture capital firm Pillar. "But a number of companies are starting to experiment to learn how to make use of it. The key factor is that the field is progressing at an exponential rate." In 2018, his firm made an early investment in Zapata Computing, a Boston startup building software for quantum computers and selling services — including ways to analyze the new cybersecurity risks that a powerful new class of computers could introduce....

In the current fiscal year, the federal government budgeted about $900 million to advance the field of quantum information science, which includes quantum computing....

[S]everal local venture capital firms are getting comfortable with placing bets on the quantum computing sector. Glasswing's Rudina Seseri says that her firm is "seeing momentum pick up," although the sector is "still in the warm-up phase, not yet in the first inning." But some of the technology being developed by startups, she says, "is so meaningful that if they get the technology to work at scale, they will be incredibly valuable."

That said, much of the revenue available to these companies today comes from researchers in academic and corporate labs trying to understand the potential of quantum computers. Sam Liss, an executive director in Harvard's Office of Technology Development, thinks that "the large commercial opportunities for quantum are still a long way off." The OTD helps attract corporate funding to Harvard research labs, and also helps to license technologies created in those labs to the private sector. "Technologies have a way of getting oversold and overhyped," Liss says. "We all recognize that this is going to take some time."

Large companies like Amazon, Google, and IBM are trying to move the field forward, and startups are beginning to demonstrate their new approaches. In the startup realm, Liss says, we're seeing enough new companies being formed and attracting funding "to support a thesis that this will be a big thing."

Supercomputing

IBM Unveils Its 433 Qubit Osprey Quantum Computer (techcrunch.com) 29

An anonymous reader quotes a report from TechCrunch: IBM wants to scale up its quantum computers to over 4,000 qubits by 2025 -- but we're not quite there yet. For now, we have to make do with significantly smaller systems and today, IBM announced the launch of its Osprey quantum processor, which features 433 qubits, up from the 127 qubits of its 2021 Eagle processor. And with that, the slow but steady march toward a quantum processor with real-world applications continues.

IBM's quantum roadmap includes two additional stages -- the 1,121-qubit Condor and 1,386-qubit Flamingo processors in 2023 and 2024 -- before it plans to hit the 4,000-qubit stage with its Kookaburra processor in 2025. So far, the company has generally been able to make this roadmap work, but the number of qubits in a quantum processor is obviously only one part of a very large and complex puzzle, with longer coherence times and reduced noise being just as important.

The company also today detailed its Quantum System Two -- basically IBM's quantum mainframe -- which will be able to house multiple quantum processors and integrate them into a single system with high-speed communication links. The idea here is to launch this system by the end of 2023.
"The new 433 qubit 'Osprey' processor brings us a step closer to the point where quantum computers will be used to tackle previously unsolvable problems," said Dario Gil, senior vice president, IBM and director of Research. "We are continuously scaling up and advancing our quantum technology across hardware, software and classical integration to meet the biggest challenges of our time, in conjunction with our partners and clients worldwide. This work will prove foundational for the coming era of quantum-centric supercomputing."

Further reading: IBM Held Talks With Biden Administration on Quantum Controls
IBM

IBM Held Talks With Biden Administration on Quantum Controls (bloomberg.com) 17

IBM has engaged in talks with the Biden administration on potential export controls for quantum computers as the company continues investing in the emerging technology. From a report: IBM recommended that any regulations, if developed, cover potentially problematic uses of quantum computing rather than limiting the technology based simply on processing power, said Dario Gil, head of IBM Research. Quantum technology will likely be subject to constraints like export controls, Gil said. "We will continue to be an active participant in that dialogue," he said.

Quantum computing is an experimental field with the potential to accelerate processing power and upend current cybersecurity standards. The Biden administration is exploring the possibility of new export controls that would limit China's access to quantum along with other powerful emerging technologies, Bloomberg News reported last month. IBM has installed quantum infrastructure in countries like Germany and Japan, but not China, Gil said. Big Blue has invested millions in the field, and is unveiling a new quantum processor this week that is more than three times more powerful, measured by qubits, than its version announced last year.

Open Source

Google Announces GUAC Open-Source Project On Software Supply Chains (therecord.media) 2

Google unveiled a new open source security project on Thursday centered around software supply chain management. The Record reports: Given the acronym GUAC -- which stands for Graph for Understanding Artifact Composition -- the project is focused on creating sets of data about a software's build, security and dependency. Google worked with Purdue University, Citibank and supply chain security company Kusari on GUAC, a free tool built to bring together many different sources of software security metadata. Google has also assembled a group of technical advisory members to help with the project -- including IBM, Intel, Anchore and more.

Google's Brandon Lum, Mihai Maruseac, and Isaac Hepworth pitched the effort as one way to help address the explosion in software supply chain attacks -- most notably the widespread Log4j vulnerability that is still leaving organizations across the world exposed to attacks. "GUAC addresses a need created by the burgeoning efforts across the ecosystem to generate software build, security, and dependency metadata," they wrote in a blog post. "GUAC is meant to democratize the availability of this security information by making it freely accessible and useful for every organization, not just those with enterprise-scale security and IT funding."

Google shared a proof of concept of the project, which allows users to search data sets of software metadata. The three explained that GUAC effectively aggregates software security metadata into a database and makes it searchable. They used the example of a CISO or compliance officer who needs to understand the "blast radius" of a vulnerability. GUAC would allow them to "trace the relationship between a component and everything else in the portfolio." Google says the tool will allow anyone to figure out the most used critical components in their software supply chain ecosystem, the security weak points and any risky dependencies. As the project evolves, Maruseac, Lum and Hepworth said the next part of the work will center around scaling the project and adding new kinds of documents that can be submitted and ingested by the system.
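GUAC's actual query API isn't shown in the post, but the "blast radius" question it describes boils down to a reverse walk over a dependency graph: start at the vulnerable component and collect everything that transitively depends on it. A minimal sketch, with an invented package graph standing in for GUAC's aggregated metadata:

```python
from collections import deque

# Invented example data: edges point from a package to its direct dependencies.
DEPENDS_ON = {
    "web-frontend": ["http-lib", "json-lib"],
    "billing-service": ["http-lib", "crypto-lib"],
    "http-lib": ["log4j-core"],
    "report-tool": ["json-lib"],
}

def blast_radius(vulnerable: str, depends_on: dict) -> set:
    """Return every package that transitively depends on `vulnerable`."""
    # Invert the edges so we can walk from the vulnerable component upward.
    dependents = {}
    for pkg, deps in depends_on.items():
        for dep in deps:
            dependents.setdefault(dep, []).append(pkg)
    affected, queue = set(), deque([vulnerable])
    while queue:
        node = queue.popleft()
        for parent in dependents.get(node, []):
            if parent not in affected:
                affected.add(parent)
                queue.append(parent)
    return affected

print(sorted(blast_radius("log4j-core", DEPENDS_ON)))
# prints ['billing-service', 'http-lib', 'web-frontend']
```

The hard part GUAC addresses isn't this traversal, which is a textbook breadth-first search; it is assembling a trustworthy graph in the first place from scattered SBOMs, build attestations, and vulnerability feeds.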

Amiga

Ask Slashdot: What Was Your First Computer? 523

Long-time Slashdot reader destinyland writes: Today GitHub's official Twitter account asked the ultimate geek-friendly question. "You never forget your first computer. What was yours?"

And within 10 hours they'd gotten 2,700 responses.

Commodore 64, TRS-80, Atari 800, Compaq Presario... People posted names you haven't heard in years, like they were sharing memories of old friends. Gateway 2000, Sony VAIO, Vic-20, Packard Bell... One person just remembered they'd had "some sort of PC that had an orange and black screen with text and QBasic. It couldn't do much more than store recipes and play text based games."

And other memories started to flow. ("Jammed on Commander Keen & Island of Dr. Brain..." "Dammit that Doom game was amazing, can't forget Oregon Trail...")

Sharp PC-4500, Toshiba T3200, Timex Sinclair 1000, NEC PC-8801. Another's first computer was "A really really old HP laptop that has a broken battery!"

My first computer was an IBM PS/2. It had a 2400 baud internal modem. Though in those long-ago days before local internet services, it was really only good for dialing up BBS's. I played chess against a program on a floppy disk that I got from a guy from work.

Can you still remember yours? Share your best memories in the comments.

What was your first computer?
China

Star American Professor Masterminded a Surveillance Machine For Chinese Big Tech (thedailybeast.com) 26

An anonymous reader quotes a report from The Daily Beast: A star University of Maryland (UMD) professor built machine-learning software "useful for surveillance" as part of a six-figure research grant from Chinese tech giant Alibaba, raising concerns that an American public university directly contributed to China's surveillance state. Alibaba provided $125,000 in funding to a research team led by Dinesh Manocha, a professor of computer science at UMD College Park, to develop urban surveillance software that can "classify the personality of each pedestrian and identify other biometric features," according to research grant documents obtained via public records request. "These capabilities will be used to predict the behavior of each pedestrian and are useful for surveillance," the document read.

Manocha is a decorated scholar in the AI and robotics field who has earned awards and accolades from Google, IBM, and many others. His star status brings rewards: Maryland taxpayers paid $355,000 in salaries to the professor in 2021, according to government watchdog Open the Books. The U.S. military also provides lavish funding for the professor's research, signing a $68 million agreement with Manocha's lab to research military applications of AI technologies. But Maryland taxpayers and the U.S. military are not the only ones funding Manocha's research. In January 2018, the University of Maryland and Alibaba signed an 18-month research contract funding Manocha's research team. In the grant document obtained by The Daily Beast, Manocha's team pledged to "work closely with Alibaba researchers" to develop urban surveillance software that can identify pedestrians based on their unique gait signatures. The algorithm would then use the gait signatures to classify pedestrians as "aggressive," "shy," "impulsive," and other personalities. The grant required UMD researchers to test the algorithm on videos provided by Alibaba and present their findings in person at Alibaba labs in China. The scholars also had to provide the C++ codebase for the software and the raw dataset as deliverables to Alibaba. The software's "clear implication is to proactively predict demonstrations and protests so that they might be quelled," analyst Ryan Fedasiuk told The Daily Beast. "Given what we know now about China's architecture of repression in Xinjiang and other regions, it is clear Dr. Manocha should not have pitched this project, and administrators at UMD should not have signed off on it."

It's not just Alibaba that was interested in the professor's expertise. In January 2019 -- back when the Alibaba grant was still active -- Manocha secured a taxpayer-funded, $321,000 Defense Department grant for his research team. The two grants funded very similar research projects. The Alibaba award was titled "large-scale behavioral learning for dense crowds." Meanwhile, the DoD grant funded research into "efficient computational models for simulating large-scale heterogeneous crowds." Unsurprisingly, the research outputs produced by the two grants had significant overlap. Between 2019 and 2021, Manocha published multiple articles in the AI and machine-learning field that cited both the Alibaba and DoD grant. There is no evidence that Manocha broke the law by double-dipping from U.S. and Chinese funding sources to fund similar research projects. Nevertheless, the case still raises "serious questions about ethics in machine learning research," Fedasiuk said.

Encryption

Semiconductor Makers Scramble to Support New Post-Quantum Cryptography Standard (eetimes.com) 40

IoT Times brings an update on "the race to create a new set of encryption standards." Last month, it was announced that a specialized security algorithm co-authored by security experts from NXP, IBM, and Arm had been selected by the U.S. Government's National Institute of Standards and Technology (NIST) to become part of a global industry standard designed to counter quantum threats.
IoT Times interviews the cryptography expert who co-created the Crystals-Kyber lattice-based algorithm selected by NIST — Joppe W. Bos, a senior principal cryptographer at the Competence Center for Cryptography and Security at NXP Semiconductors.

And what worries his colleagues at the semiconductor company isn't the "imminent threat of quantum computers," Bos says, but an even closer and more practical deadline: "the timeline for these post-quantum crypto standards." "Two weeks ago, NIST announced the winners of these new public standards, the post-quantum crypto standards, and their timeline is that in 2024, so in roughly two years, the winners will be converted into standards. And as soon as the standards are released, our customers will expect NXP Semiconductors, as one of the leaders in crypto and security, to already have support for these standards, because we are, of course, at the start of the chain for many end products. Our secure elements, our secure platforms, SOCs, are one of the first things that need to be integrated into larger platforms that go into end products. Think about industrial IoT. Think about automotive applications. So, our customers already expect us to support post-quantum crypto standards in 2024, and not only support but, for many companies, being able to meet the functional requirements of the standard.

"It took over ten years to settle down on the best methods for RSA and ECC, and now we have a much shorter timeline to get ready for post-quantum crypto."

"When you ask the experts, it ranges from one to five decades until we can see quantum computers big enough to break our current crypto," Bos says in the interview. So he stresses that they're not driven by fear of quantum computers. "The right question to ask, at least for us at NXP is, when is this new post-quantum crypto standard available? Because then, our customers will ask for post-quantum support, and we need to be ready.

"The standard really drives our development and defines our roadmap."

But speaking of the standard's "functional requirements," in the original story submission Slashdot reader dkatana raised an interesting point. There are already billions of low-powered IoT devices in the world.

Will they all have the memory and processing power to use this new lattice-based encryption?
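For a rough sense of the footprint involved, the published CRYSTALS-Kyber parameter sets (standardized by NIST as ML-KEM in FIPS 203) can be compared against a classical X25519 key exchange. The sketch below simply tabulates the key and ciphertext sizes from the specification; no cryptographic library is needed:

```python
# CRYSTALS-Kyber (ML-KEM) object sizes in bytes, taken from the
# specification, versus a classical X25519 key exchange -- a rough
# proxy for the extra memory a constrained IoT device would need.
KYBER_SIZES = {              # (public key, ciphertext, secret key)
    "Kyber512":  (800,  768,  1632),
    "Kyber768":  (1184, 1088, 2400),
    "Kyber1024": (1568, 1568, 3168),
}
X25519_KEY = 32              # X25519 public keys and shares are 32 bytes

for name, (pk, ct, sk) in KYBER_SIZES.items():
    print(f"{name}: public key {pk} B, ciphertext {ct} B, "
          f"secret key {sk} B ({pk // X25519_KEY}x the X25519 public key)")
```

Even the smallest parameter set moves tens of times more data per handshake than today's elliptic-curve exchanges, which is exactly why RAM- and bandwidth-limited devices are the pressure point dkatana raises.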

Open Source

How W4 Plans To Monetize the Godot Game Engine Using Red Hat's Open Source Playbook (techcrunch.com) 8

An anonymous reader quotes a report from TechCrunch: A new company from the creators of the Godot game engine is setting out to grab a piece of the $200 billion global video game market -- and to do so, it's taking a cue from commercial open source software giant Red Hat. Godot, for the uninitiated, is a cross-platform game engine first released under an open source license back in 2014, though its initial development pre-dates that by several years. Today, Godot claims some 1,500 contributors, and is considered one of the world's top open source projects by various metrics. Godot has been used in high-profile games such as the Sonic Colors: Ultimate remaster, published by Sega last year as the first major mainstream game powered by Godot. But Tesla, too, has apparently used Godot to power some of the more graphically intensive animations in its mobile app.

Among Godot's founding creators is Juan Linietsky, who has served as head of development for the Godot project for the past 13 years, and who will now serve as CEO of W4 Games, a new venture that's setting out to take Godot to the next level. W4 quietly exited stealth last week, but today the Ireland-headquartered company has divulged more details about its goals to grow Godot and make it accessible for a wider array of commercial use cases. On top of that, the company told TechCrunch that it has raised $8.5 million in seed funding to make its mission a reality, with backers including OSS Capital, Lux Capital, Sisu Game Ventures and -- somewhat notably -- Bob Young, the co-founder and former CEO of Red Hat, an enterprise-focused open source company that IBM went on to acquire for $34 billion in 2019.

[...] "Companies like Red Hat have proven that with the right commercial offerings on top, the appeal of using open source in enterprise environments is enormous," Linietsky said. "W4 intends to do this very same thing for the game industry." In truth, Godot is nowhere near having the kind of impact in gaming that Linux has had in the enterprise, but it's still early days -- and this is exactly where W4 could make a difference. [...] W4's core target market will be broad -- it's gunning for independent developers and small studios, as well as medium and large gaming companies. The problem that it's looking to solve, ultimately, is that while Godot is popular with hobbyists and indie developers, companies are hesitant to use the engine on commercial projects due to its inherent limitations -- currently, there is no easy way to garner technical support, discuss the product's development roadmap, or access any other kind of value-added service. [...]

"W4 will offer console ports to developers under very accessible terms," Linietsky said. "Independent developers won't need to pay upfront to publish, while for larger companies there will be commercial packages that include support." Elsewhere, W4 is developing a range of products and services which it's currently keeping under wraps, with Linietsky noting that they will most likely be announced at Game Developers Conference (GDC) in San Francisco next March. "The aim of W4 is to help developers overcome any problem developers may stumble upon while trying to use Godot commercially," Linietsky added. It's worth noting that there are a handful of commercial companies out there already, such as Lone Wolf Technology and Pineapple Works, that help developers get the most out of Godot -- including console porting. But Linietsky was keen to highlight one core difference between W4 and these incumbents: its expertise. "The main distinctive feature of W4 is that it has been created by the Godot project leadership, which are the individuals with the most understanding and insight about Godot and its community," he said.

Google

Google's Quantum Supremacy Challenged By Ordinary Computers, For Now (newscientist.com) 18

Google has been challenged by an algorithm that could solve a problem faster than its Sycamore quantum computer, which it used in 2019 to claim the first example of "quantum supremacy" -- the point at which a quantum computer can complete a task that would be impossible for ordinary computers. Google concedes that its 2019 record won't stand, but says that quantum computers will win out in the end. From a report: Sycamore achieved quantum supremacy in a task that involves verifying that a sample of numbers output by a quantum circuit has a truly random distribution, which it was able to complete in 3 minutes and 20 seconds. The Google team said that even the world's most powerful supercomputer at the time, IBM's Summit, would take 10,000 years to achieve the same result. Now, Pan Zhang at the Chinese Academy of Sciences in Beijing and his colleagues have created an improved algorithm for a non-quantum computer that can solve the random sampling problem much faster, challenging Google's claim that a quantum computer is the only practical way to do it. The researchers found that they could skip some of the calculations without affecting the final output, which dramatically reduces the computational requirements compared with the previous best algorithms. The researchers ran their algorithm on a cluster of 512 GPUs, completing the task in around 15 hours. While this is significantly longer than Sycamore, they say it shows that a classical computer approach remains practical.
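The sampling task is typically scored with linear cross-entropy benchmarking (XEB), which rewards samplers whose bitstrings concentrate on the circuit's high-probability outputs. The toy sketch below substitutes a random complex state for a real circuit simulation (an assumption for illustration, not Sycamore's actual circuit) to show how the score separates a faithful sampler from uniform noise:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 12                      # toy number of qubits (Sycamore used 53)
dim = 2 ** n

# Stand-in for the ideal circuit output: a random complex state yields
# a Porter-Thomas-like probability distribution over bitstrings.
amps = rng.normal(size=dim) + 1j * rng.normal(size=dim)
p_ideal = np.abs(amps) ** 2
p_ideal /= p_ideal.sum()

def xeb_fidelity(samples):
    """Linear XEB score: F = 2^n * mean(p_ideal[x]) - 1 over the samples."""
    return dim * p_ideal[samples].mean() - 1

faithful = rng.choice(dim, size=200_000, p=p_ideal)  # samples the ideal dist
noise = rng.integers(0, dim, size=200_000)           # uniform random bits

print(f"XEB of faithful sampler: {xeb_fidelity(faithful):.2f}")  # near 1
print(f"XEB of uniform sampler:  {xeb_fidelity(noise):.2f}")     # near 0
```

A classical algorithm "wins" if it can produce samples scoring near 1 fast enough; Zhang's tensor-network approach showed that skipping calculations that barely affect this score makes the classical route far cheaper than Google's 10,000-year estimate assumed.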
IT

Confronting an Ancient Indian Hierarchy, Apple and IBM Ban Discrimination By Caste (reuters.com) 181

"Apple, the world's biggest listed company, updated its general employee conduct policy about two years ago to explicitly prohibit discrimination on the basis of caste," reports Reuters, "which it added alongside existing categories such as race, religion, gender, age and ancestry."

Apple has more than 165,000 full-time employees, the article points out, and "The inclusion of the new category, which hasn't been previously reported, goes beyond U.S. discrimination laws, which do not explicitly ban casteism." The update came after the tech sector — which counts India as its top source of skilled foreign workers — received a wake-up call in June 2020 when California's employment regulator sued Cisco Systems on behalf of a low-caste engineer who accused two higher-caste bosses of blocking his career.... Since the suit was filed, several activist and employee groups have begun seeking updated U.S. discrimination legislation — and have also called on tech companies to change their own policies to help fill the void and deter casteism....

Elsewhere in tech, IBM told Reuters that it added caste, which was already in India-specific policies, to its global discrimination rules after the Cisco lawsuit was filed, though it declined to give a specific date or a rationale.

Meta, Amazon, and Google do not mention caste in internal policies, the article points out -- but they all told Reuters it's already prohibited by their current policies against discrimination.

And yet, "Over 1,600 Google workers demanded the addition of caste to the main workplace code of conduct worldwide in a petition, seen by Reuters, which they emailed to CEO Sundar Pichai last month and re-sent last week after no response."
Red Hat Software

From Software Developer To CEO: Red Hat's Matt Hicks On His Journey To the Top (zdnet.com) 17

ZDNet's Stephanie Condon spoke with Red Hat's new CEO, Matt Hicks, a company veteran who has worked there for over 14 years. An anonymous reader shares an excerpt from their discussion: Matt Hicks, Red Hat's new CEO, doesn't have the background of your typical chief executive. He studied computer hardware engineering in college. He began his career as an IT consultant at IBM. His on-the-ground experience, however, is one of his core assets as the company's new leader, Hicks says. "The markets are changing really quickly," he tells ZDNet. "And just having that intuition -- of where hardware is going, having spent time in the field with what enterprise IT shops struggle with and what they do well, and then having a lot of years in Red Hat engineering -- I know that's intuition that I'll lean on... Around that, there's a really good team at Red Hat, and I get to lean on their expertise of how to best deliver, but that I love having that core intuition."

Hicks believes his core knowledge helps him to guide the company's strategic bets. While his experience is an asset, Hicks says it's not a given that a good developer will make a good leader. You also need to know how to communicate your ideas persuasively. "You can't just be the best coder in the room," he says. "Especially in STEM and engineering, the softer skills of learning how to present, learning how to influence a group and show up really well in a leadership presentation or at a conference -- they really start to define people's careers."

Hicks says that focus on influence is an important part of his role now that he didn't relish earlier in his career. "I think a lot of people don't love that," he says. "And yet, you can be the best engineer on the planet and work hard, but if you can't be heard, if you can't influence, it's harder to deliver on those opportunities." Hicks embraced the art of persuasion to advance his career. And as an open-source developer, he learned to embrace enterprise products to advance Red Hat's mission. He joined Red Hat just a few years after Paul Cormier -- then Red Hat's VP of engineering, and later Hicks' predecessor as CEO -- moved the company from its early distribution, Red Hat Linux, to Red Hat Enterprise Linux (RHEL). It was a move that not everyone liked. [...]
"As he settles into his new role as CEO, the main challenge ahead of Hicks will be picking the right industries and partners to pursue at the edge," writes Condon. "Red Hat is already working at the edge, in a range of different industries. It's working with General Motors on Ultifi, GM's end-to-end software platform, and it's partnering with ABB, one of the world's leading manufacturing automation companies. It's also working with Verizon on hybrid mobile edge computing. Even so, the opportunity is vast. Red Hat expects to see around $250 billion in spending at the edge by 2025."

"There'll be a tremendous growth of applications that are written to be able to deliver to that," Hicks says. "And so our goals in the short term are to pick the industries and build impactful partnerships in those industries -- because it's newer, and it's evolving."
