IBM

IBM Sells Some Watson Health Assets for More Than $1 Billion (bloomberg.com) 17

IBM agreed to sell part of its IBM Watson Health business to private equity firm Francisco Partners, scaling back the technology company's once-lofty ambitions in health care. From a report: The value of the assets being sold, which include extensive and wide-ranging data sets and products, and image software offerings, is more than $1 billion, according to people familiar with the plans. The deal "is a clear next step as IBM becomes even more focused on our platform-based hybrid cloud and AI strategy," said Tom Rosamilia, senior vice president, IBM Software. "IBM remains committed to Watson, our broader AI business, and to the clients and partners we support in healthcare IT." IBM launched Watson Health in 2015 with the aim of using its core artificial intelligence platform to help health care providers analyze troves of data and ultimately revolutionize cancer treatment. Many of the company's ambitions haven't panned out, though, and some customers have complained that its products didn't match the hype. Even after spending roughly $4 billion in acquisitions to prop up the initiative, Watson hasn't delivered the kind of progress IBM initially envisioned and the unit wasn't profitable. Last year, the Wall Street Journal reported the unit generated about $1 billion of annual revenue.
Input Devices

The Origin of the Blinking Cursor (inverse.com) 99

Long-time Slashdot reader jimminy_cricket shares a new article from the technology site Inverse exploring the origin of blinking cursors.

They trace the invention to the 1960s and electronics engineer Charles Kiesling, a naval veteran of the Korean War who "spent his immediate post-war years on a new challenge: the exploding computing age." Personal computers — let alone portable ones — were still decades away, and Kiesling was joining the ranks of engineers tinkering with room-sized computers like the IBM 650 or the aging ENIAC. He joined Sperry Rand, now Unisys, in 1955, and helped develop the kind of computer guts that casual users rarely think about. This includes innards like logic circuitry, which enables your computer to make complex conditional decisions like "or," "and," or "if only" instead of simply "yes" or "no." One of these seemingly innocuous advancements was a 1967 patent filing Kiesling made for a blinking cursor....

According to a post on a computer science message board from a user purporting to be Kiesling's son, the inspiration for this invention was simply utility. "I remember him telling me the reason behind the blinking cursor, and it was simple," Kiesling's son writes. "He said there was nothing on the screen to let you know where the cursor was in the first place. So he wrote up the code for it so he would know where he was ready to type on the Cathode Ray Tube."

The blinking, it turns out, is simply a way to catch the coder's attention and help the cursor stand apart from a sea of text.

The article credits Apple with popularizing the blinking cursor for the masses. It also recounts a fun story about Steve Jobs (shared by Thomas Haigh, a professor of technology history at the University of Wisconsin-Milwaukee): While Jobs supported the blinking cursor itself, Haigh says he was famously against controlling it with cursor keys. Jobs attempted — and failed — to remove these keys from the original Mac in an effort to force users to adopt the mouse instead. In an interaction with biographer Walter Isaacson years later, he even pried the cursor keys off a keyboard with his car keys before signing his autograph on it.
IBM

IBM Tries To Sell Watson Health Again (axios.com) 17

IBM has resurrected its sale process for IBM Watson Health, with hopes of fetching more than $1 billion, people familiar with the situation told Axios. From the report: Big Blue wants out of health care, after spending billions to stake its claim, just as rival Oracle is moving big into the sector via its $28 billion bet for Cerner. IBM spent more than $4 billion to build Watson Health via a series of acquisitions. The business now includes health care data and analytics business Truven Health Analytics, population health company Phytel, and medical imaging business Merge Healthcare. IBM first explored a sale of the division in early 2021, with Morgan Stanley leading the process. WSJ reported at the time that the unit was generating roughly $1 billion in annual revenue, but was unprofitable. Sources say it continues to lose money.
Programming

'A Quadrillion Mainframes On Your Lap' (ieee.org) 101

"Your laptop is way more powerful than you might realize," writes long-time Slashdot reader fahrbot-bot.

"People often rhapsodize about how much more computer power we have now compared with what was available in the 1960s during the Apollo era. Those comparisons usually grossly underestimate the difference."

Rodney Brooks, emeritus professor of robotics at MIT (and former director of its AI Lab and CSAIL), explains in IEEE Spectrum: By 1961, a few universities around the world had bought IBM 7090 mainframes. The 7090 was the first line of all-transistor computers, and it cost US $20 million in today's money, or about 6,000 times as much as a top-of-the-line laptop today. Its early buyers typically deployed the computers as a shared resource for an entire campus. Very few users were fortunate enough to get as much as an hour of computer time per week.

The 7090 had a clock cycle of 2.18 microseconds, so the operating frequency was just under 500 kilohertz. But in those days, instructions were not pipelined, so most took more than one cycle to execute. Some integer arithmetic took up to 14 cycles, and a floating-point operation could hog up to 15. So the 7090 is generally estimated to have executed about 100,000 instructions per second. Most modern computer cores can operate at a sustained rate of 3 billion instructions per second, with much faster peak speeds. That is 30,000 times as fast, so a modern chip with four or eight cores is easily 100,000 times as fast.

Unlike the lucky person in 1961 who got an hour of computer time, you can run your laptop all the time, racking up more than 1,900 years of 7090 computer time every week....
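
Those figures are easy to check. Here is a quick back-of-envelope sketch in Python; every input is a number quoted above, not a measurement:

```python
# A minimal sanity check of the 7090-vs-laptop figures quoted above.
# All inputs come from the article; nothing here is measured.

CYCLE_SECONDS = 2.18e-6                      # IBM 7090 clock cycle
print(f"7090 clock: {1 / CYCLE_SECONDS / 1e3:.0f} kHz")      # ~459 kHz, "just under 500"

IBM_7090_IPS = 100_000                       # ~100k instructions/second
MODERN_CORE_IPS = 3e9                        # sustained rate of one modern core
per_core = MODERN_CORE_IPS / IBM_7090_IPS
print(f"per-core speedup: {per_core:,.0f}x")                 # 30,000x
print(f"4-core chip: {4 * per_core:,.0f}x")                  # ~120,000x, "easily 100,000x"

# One laptop-week expressed in 7090 time, using the ~100,000x figure:
weeks_per_year = 365.25 / 7
print(f"{100_000 / weeks_per_year:,.0f} years")              # ~1,916 years
```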

But, really, this comparison is unfair to today's computers. Your laptop probably has 16 gigabytes of main memory. The 7090 maxed out at 144 kilobytes. To run the same program would require an awful lot of shuffling of data into and out of the 7090 — and it would have to be done using magnetic tapes. The best tape drives in those days had maximum data-transfer rates of 60 KB per second. Although 12 tape units could be attached to a single 7090 computer, that rate needed to be shared among them. But such sharing would require that a group of human operators swap tapes on the drives; to read (or write) 16 GB of data this way would take three days. So data transfer, too, was slower by a factor of about 100,000 compared with today's rate.
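
The three-day figure follows directly from the quoted tape bandwidth; a minimal check under the same assumptions:

```python
# Reading 16 GB at the 7090's aggregate tape bandwidth of 60 KB/s
# (shared across all attached drives), per the figures above.
DATA_BYTES = 16e9
TAPE_BYTES_PER_SECOND = 60e3
days = DATA_BYTES / TAPE_BYTES_PER_SECOND / 86_400
print(f"{days:.1f} days")   # ~3.1 days, matching the article
```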

So now the 7090 looks to have run at about a quadrillionth (10^-15) the speed of your 2021 laptop. A week of computing time on a modern laptop would take longer than the age of the universe on the 7090.

Power

IBM and Samsung Say Their New Chip Design Could Lead To Week-Long Battery Life On Phones (theverge.com) 85

IBM and Samsung have announced their latest advance in semiconductor design: a new way to stack transistors vertically on a chip (instead of lying flat on the surface of the semiconductor). The Verge reports: The new Vertical Transport Field Effect Transistors (VTFET) design is meant to succeed the current FinFET technology that's used for some of today's most advanced chips and could allow for chips that are even more densely packed with transistors than today. In essence, the new design would stack transistors vertically, allowing for current to flow up and down the stack of transistors instead of the side-to-side horizontal layout that's currently used on most chips. Vertical designs for semiconductors have been a trend for a while (FinFET already offers some of those benefits); Intel's future roadmap also looks to move in that direction, too, although its initial work focused on stacking chip components rather than individual transistors. It makes sense, after all: when you've run out of ways to add more chips in one plane, the only real direction (other than physically shrinking transistor technology) is to go up.

While we're still a ways away from VTFET designs being used in actual consumer chips, the two companies are making some big claims, noting that VTFET chips could offer a "two times improvement in performance or an 85 percent reduction in energy use" compared to FinFET designs. And by packing more transistors into chips, IBM and Samsung claim that VTFET technology could help keep Moore's law's goal of steadily increasing transistor count moving forward. IBM and Samsung are also citing some ambitious possible use cases for the new technology, raising the idea of "cell phone batteries that could go over a week without being charged, instead of days," less energy-intensive cryptocurrency mining or data encryption, and more powerful IoT devices, or even spacecraft.

Technology

How Much Has Quantum Computing Actually Advanced? (ieee.org) 13

For a measured perspective on how much quantum computing is actually advancing as a field, IEEE Spectrum spoke with John Martinis, a professor of physics at the University of California, Santa Barbara, and the former chief architect of Google's Sycamore. From a report: IEEE Spectrum: So it's been about two years since you unveiled results from Sycamore. In the last few weeks, we've seen announcements of a 127-qubit chip from IBM and a 256-qubit neutral atom quantum computer from QuEra. What kind of progress would you say has actually been made?
John Martinis: Well, clearly, everyone's working hard to build a quantum computer. And it's great that there are all these systems people are working on. There's real progress. But if you go back to one of the points of the quantum supremacy experiment -- and something I've been talking about for a few years now -- one of the key requirements is gate errors. I think gate errors are way more important than the number of qubits at this time. It's nice to show that you can make a lot of qubits, but if you don't make them well enough, it's less clear what the advance is. In the long run, if you want to do a complex quantum computation, say with error correction, you need way below 1% gate errors. So it's great that people are building larger systems, but it would be even more important to see data on how well the qubits are working. In this regard, I am impressed with the group in China who reproduced the quantum supremacy results, where they show that they can operate their system well with low errors.
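
One crude way to see why Martinis cares more about gate errors than qubit counts: if each gate is modeled as failing independently with probability p, a g-gate circuit runs cleanly with probability roughly (1 - p)^g. The toy model below illustrates that scaling; it is not how real error budgets are computed:

```python
# Toy model: each gate fails independently with probability p, so a
# g-gate circuit runs error-free with probability (1 - p) ** g.
def circuit_success(p: float, gates: int) -> float:
    return (1 - p) ** gates

for p in (0.01, 0.001):                  # 1% vs 0.1% gate error
    for gates in (100, 1_000, 10_000):
        print(f"p={p:<6} gates={gates:>6}  success={circuit_success(p, gates):.2e}")
# At 1% error, a 1,000-gate circuit succeeds ~4e-5 of the time; at 0.1%,
# ~0.37 -- which is why "way below 1%" matters more than raw qubit count.
```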

Intel

Intel's Expensive New Plan to Upgrade Its Chip Technology - and US Manufacturing (cnet.com) 131

America's push to manufacture more products domestically gets an in-depth look from CNET — including a new Intel chip factory outside of Phoenix.

CNET calls it a fork in the road "after squandering its lead because of a half decade of problems modernizing its manufacturing..." With "a decade of bad decisions, this doesn't get fixed overnight," says Pat Gelsinger, Intel's new chief executive, in an interview. "But the bottom is behind us and the slope is starting to feel increasingly strong...." More fabs are on the way, too. In an enormous empty patch of dirt at its existing Arizona site, Intel has just begun building fabs 52 and 62 at a total cost of $20 billion, set to make Intel's most advanced chips, starting in 2024. Later this year, it hopes to announce the U.S. location for its third major manufacturing complex, a 1,000-acre site costing about $100 billion. The spending commitment makes this year's $3.5 billion upgrade to its New Mexico fab look cheap. The goal is to restore the U.S. share of chip manufacturing, which has slid from 37% in 1990 to 12% today. "Over the decade in front of us, we should be striving to bring the U.S. to 30% of worldwide semiconductor manufacturing," Gelsinger says...

But returning Intel to its glory days — and anchoring a resurgent U.S. electronics business in the process — is much easier said than done. Making chips profitably means running fabs at maximum capacity to pay off the gargantuan investments required to stay at the leading edge. A company that can't keep pace gets squeezed out, like IBM in 2014 or GlobalFoundries in 2018. To catch up after its delays, Intel now plans to upgrade its manufacturing five times in the next four years, a breakneck pace by industry standards. "This new roadmap that they announced is really aggressive," says Linley Group analyst Linley Gwennap. "I don't have any idea how they are going to accomplish all of that...."

Gelsinger has a tech-first recovery plan. He's pledged to accelerate manufacturing upgrades to match the technology of TSMC and Samsung by 2024 and surpass them in 2025. He's opening Intel's fabs to other companies that need chips built through its new Intel Foundry Services (IFS). And he's relying on other foundries, including TSMC, for about a quarter of Intel's near-term chipmaking needs to keep its chips more competitive during the upgrades. This three-pronged strategy is called IDM (integrated device manufacturing) 2.0. That's a new take on Intel's philosophy of both designing and making chips. It's more ambitious than the future some had expected, in which Intel would sell its factories and join the ranks of "fabless" chip designers like Nvidia, AMD and Qualcomm that rely on others for manufacturing...

Shareholders may not like Gelsinger's spending-heavy strategy, but one community really does: Intel's engineers... Gelsinger told the board that Intel is done with stock buybacks, a financial move in which a company uses its cash to buy its own stock and thereby increase its price. "We're investing in factories," he told me. "That's going to be the use of our cash...."

"We cannot recall the last time Intel put so many stakes in the ground," said BMO Capital Markets analyst Ambrish Srivastava in a July research report after Intel announced its schedule.

Intel will even outpace Moore's law, Gelsinger tells CNET — more than doubling the transistor count on processors every two years. "I believe that you're going to see from 2025 to 2035 a very healthy period for Moore's Law-like behavior."

Although that still brings some risk to Intel's investments if it has to pass the costs on to customers, a Linley Group analyst points out to CNET. "Moore's Law is not going to end when we can't build smaller transistors. It's going to end when somebody says I don't want to pay for smaller transistors."
Supercomputing

Japan's Fugaku Retains Title As World's Fastest Supercomputer (datacenterdynamics.com) 13

According to a report from Nikkei Asia (paywalled), "The Japanese-made Fugaku captured its fourth consecutive title as the world's fastest supercomputer on Tuesday, although a rival from the U.S. or China is poised to steal the crown as soon as next year." From a report: But while Fugaku is the world's most powerful public supercomputer, at 442 petaflops, China is believed to secretly operate two exascale (1,000 petaflops) supercomputers, which were launched earlier this year. The top 10 list did not change much since the last report six months ago, with only one new addition -- a Microsoft Azure system called Voyager-EUS2. Voyager, featuring AMD Epyc CPUs and Nvidia A100 GPUs, achieved 30.05 petaflops, making it the tenth most powerful supercomputer in the world.

The other systems remained in the same positions: after Japan's Arm-based Fugaku comes the US Summit system, an IBM Power and Nvidia GPU supercomputer capable of 148 petaflops. The similarly-architected 94 petaflops US Sierra system is next. Then comes what is officially China's most powerful supercomputer, the 93 petaflops Sunway TaihuLight, which features Sunway chips. The Biden administration sanctioned the company earlier this year.
Intel

The Chip That Changed the World (wsj.com) 97

The world changed on Nov. 15, 1971, and hardly anyone noticed. It is the 50th anniversary of the launch of the Intel 4004 microprocessor, a computer carved onto silicon, an element as plentiful on earth as sand on a beach. Microprocessors unchained computers from air-conditioned rooms and freed computing power to go wherever it is needed most. Life has improved exponentially since. From a report: Back then, IBM mainframes were kept in sealed rooms and were so expensive companies used argon gas instead of water to put out computer-room fires. Workers were told to evacuate on short notice, before the gas would suffocate them. Feeding decks of punch cards into a reader and typing simple commands into clunky Teletype machines were the only ways to interact with the IBM computers. Digital Equipment Corp. sold PDP-8 minicomputers to labs and offices that weighed 250 pounds. In 1969, Nippon Calculating Machine Corp. asked Intel to design 12 custom chips for a new printing calculator. Engineers Federico Faggin, Stanley Mazor and Ted Hoff were tired of designing different chips for various companies and suggested instead four chips, including one programmable chip they could use for many products. Using only 2,300 transistors, they created the 4004 microprocessor. Four bits of data could move around the chip at a time. The half-inch-long rectangular integrated circuit had a clock speed of 750 kilohertz and could do about 92,000 operations a second.
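
As a quick sanity check on those 4004 figures, the quoted clock speed and operation rate imply the chip spent roughly eight clock cycles per operation:

```python
# The 4004's quoted clock and throughput imply its cycles-per-operation.
CLOCK_HZ = 750_000        # 750 kHz clock speed
OPS_PER_SECOND = 92_000   # ~92,000 operations per second
print(f"~{CLOCK_HZ / OPS_PER_SECOND:.1f} cycles per operation")  # ~8.2
```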

Intel introduced the 3,500-transistor, eight-bit 8008 in 1972; the 29,000-transistor, 16-bit 8086, capable of 710,000 operations a second, was introduced in 1978. IBM used the next iteration, the Intel 8088, for its first personal computer. By comparison, Apple's new M1 Max processor has 57 billion transistors doing 10.4 trillion floating-point operations a second. That is at least a billionfold increase in computer power in 50 years. We've come a long way, baby. When I met Mr. Hoff in the 1980s, he told me that he once took his broken television to a repairman, who noted a problem with the microprocessor. The repairman then asked why he was laughing. Now that everyone has a computer in his pocket, one of my favorite movie scenes isn't quite so funny. In "Take the Money and Run" (1969), Woody Allen's character interviews for a job at an insurance company and his interviewer asks, "Have you ever had any experience running a high-speed digital electronic computer?" "Yes, I have." "Where?" "My aunt has one."

IBM

IBM Claims Quantum Computing Breakthrough (axios.com) 79

Axios reports: IBM has created a quantum processor able to process information so complex the work can't be done or simulated on a traditional computer, CEO Arvind Krishna told "Axios on HBO" ahead of a planned announcement.

Why it matters: Quantum computing could help address problems that are too challenging for even today's most powerful supercomputers, such as figuring out how to make better batteries or sequester carbon emissions.

Driving the news: IBM says its new Eagle processor can handle 127 qubits, a measure of quantum computing power. In topping 100 qubits, IBM says it has reached a milestone that allows quantum to surpass the power of a traditional computer. "It is impossible to simulate it on something else, which implies it's more powerful than anything else," Krishna told "Axios on HBO...."
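
For rough context on the "impossible to simulate" claim: a brute-force state-vector simulation of n qubits must track 2^n complex amplitudes, so memory grows exponentially with qubit count. The sketch below shows that scaling; note that state-vector simulation is only one baseline, and specialized methods can sometimes do better for particular circuits (see the Sycamore story later in this digest):

```python
# Memory needed for a brute-force state-vector simulation of n qubits,
# at 16 bytes per complex amplitude (two 64-bit floats).
BYTES_PER_AMPLITUDE = 16
for n in (30, 50, 127):
    print(f"{n:>3} qubits: {2**n * BYTES_PER_AMPLITUDE:.2e} bytes")
# 30 qubits fits in a laptop (~17 GB); 50 needs ~18 PB; 127 needs ~2.7e39
# bytes, astronomically beyond any conceivable machine.
```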

Krishna says the quantum computing push is one part of his approach to return the company to growth.

"Some people think of it as science fiction," Krishna says of quantum computing in a preview clip from the Axios interview. "I think of it now as a feat of engineering."
China

Have Scientists Disproven Google's Quantum Supremacy Claim? (scmp.com) 35

Slashdot reader AltMachine writes: In October 2019, Google said its Sycamore processor was the first to achieve quantum supremacy by completing a task in three minutes and 20 seconds that would have taken the best classical supercomputer, IBM's Summit, 10,000 years. That claim — particularly how Google scientists arrived at the "10,000 years" conclusion — has been questioned by some researchers, but the counterclaim itself was not definitive.

Now though, in a paper to be submitted to a scientific journal for peer review, scientists at the Institute of Theoretical Physics under the Chinese Academy of Sciences said their algorithm on classical computers completed the simulation for the Sycamore quantum circuits [possibly paywalled; alternative source of the same article] "in about 15 hours using 512 graphics processing units (GPUs)" at a higher fidelity than Sycamore's. Further, the team said "if our simulation of the quantum supremacy circuits can be implemented in an upcoming exaflop supercomputer with high efficiency, in principle, the overall simulation time can be reduced to a few dozens of seconds, which is faster than Google's hardware experiments".

Since China had already unveiled, in December 2020, a photonic quantum computer that solved a Gaussian boson sampling problem in 200 seconds (a task that would take a classical computer 600 million years), disproving Sycamore's claim would make China the first country to achieve quantum supremacy.

IBM

Last of Original SCO v IBM Linux Lawsuit Settled (zdnet.com) 126

"[N]ow, after SCO went bankrupt; court after court dismissing SCO's crazy copyright claims; and closing in on 20-years into the saga, the U.S. District Court of Utah has finally put a period to the SCO vs. IBM lawsuit," writes ZDNet's Steven J. Vaughan-Nichols. From the report: According to the Court, since: "All claims and counterclaims in this matter, whether alleged or not alleged, pleaded or not pleaded, have been settled, compromised, and resolved in full, and for good cause appearing, IT IS HEREBY ORDERED that the parties' Motion is GRANTED. All claims and counterclaims in this action, whether alleged or not alleged, pleaded or not pleaded, have been settled, compromised, and resolved in full, and are DISMISSED with prejudice and on the merits. The parties shall bear their own respective costs and expenses, including attorneys' fees. The Clerk is directed to close the action." Finally!

Earlier, the US Bankruptcy Court for the District of Delaware, which has been overseeing SCO's bankruptcy, had announced that the TSG Group, which represents SCO's debtors, had settled with IBM and resolved all the remaining claims between TSG and IBM: "Under the Settlement Agreement, the Parties have agreed to resolve all disputes between them for a payment to the Trustee [for TSG], on behalf of the Estates [of SCO], of $14,250,000." In return, TSG gives up all rights and interests in all litigation claims pending or that may be asserted in the future against IBM and Red Hat, and any allegations that Linux violates SCO's Unix intellectual property.
"While we're one step closer, the SCO lawsuits still live on just like one of those Halloween monsters that just won't die," concludes Vaughan-Nichols, noting the lawsuit Xinuos filed against IBM and Red Hat in March for allegedly copying their software code for its server operating systems. "But, in this go-around, there aren't many people in the audience."
Red Hat Software

Red Hat Forced To Hire Cheaper, Less Senior Engineers Amid Budget Freeze (theregister.com) 133

Next year, IBM's Red Hat plans to cut back on hiring senior engineers in an effort aimed largely at controlling costs. The Register reports: An internal email sent on Wednesday by Timothy Cramer, SVP of software engineering, to Red Hat managers directs hiring requisitions to be made at a lower level of seniority than usual. "All new plan reqs should be opened at a level below senior (e.g., Associate Software Engineer or Software Engineer)," the message says. "While this change allows us to use our budget more effectively, it also helps us balance the organization as we have many engineers with senior titles. We recognize that this will mean we need to plan for training and mentoring, promotions, and internal mobility as well, and we are here to support you in that."

The hiring budget update also says that current requisitions and backfills -- positions vacated that need to be filled -- should be offered at a reduced level. "All current reqs and future backfills will be down-leveled by one level by default (e.g., Senior Software Engineer to Software Engineer)," the memo explained. [...] Our source expressed concern that this decision, which applies to new hires, will harm the company. If Red Hat is unable to offer competitive pay or hire senior people, our source suggested, that's likely to limit the company's access to talent and to make it more difficult to retain existing skilled employees. "The best talent wants to work with other like-minded and skilled people," our source said.

IBM

McDonald's Partners With IBM To Automate Drive-Thru Lanes (cnbc.com) 118

McDonald's said Wednesday it has entered a strategic partnership with IBM to develop artificial intelligence technology that will help the fast-food chain automate its drive-thru lanes. CNBC reports: As part of the deal, IBM will acquire McD Tech Labs, which was formerly known as Apprente before McDonald's bought the tech company in 2019. McDonald's didn't disclose financial terms for either transaction. "In my mind, IBM is the ideal partner for McDonald's given their expertise in building AI-powered customer care solutions and voice recognition," McDonald's CEO Chris Kempczinski said on the earnings call with analysts Wednesday.

The Apprente technology uses AI to understand drive-thru orders. This summer, McDonald's tested the tech in a handful of Chicago restaurants. Kempczinski said that the test showed "substantial benefits" to customers and employees. In June, at the same conference where he disclosed the Chicago test, Kempczinski shared McDonald's strategy for tech acquisitions. "If we do acquisitions, it will be for a short period of time, bring it in house, jumpstart it, turbo it and then spin it back out and find a partner that will work and scale it for us," he said. CFO Kevin Ozan said that fewer than 100 employees will leave McDonald's to work for IBM.

Cloud

Alliance Including Amazon, Google, Microsoft, and IBM Vows to Protect Rights and Privacy With 'Trusted Cloud Principles' (zdnet.com) 33

ZDNet reports: Some of the world's largest tech giants — Amazon, Google, Microsoft, IBM, Salesforce/Slack, Atlassian, SAP, and Cisco — have joined forces to establish the Trusted Cloud Principles in what they are claiming is their commitment to protecting the rights of their customers... Some of the specific principles set out by the signatories include: governments should seek data directly from enterprise customers first, rather than cloud providers, other than in "exceptional circumstances"; customers should have a right to notice when governments seek to access customer data directly from cloud service providers; and there should be a clear process for cloud providers to challenge government access requests for customers' data, including notifying relevant data protection authorities, to protect customers' interests.

Also outlined in the principles is the point that governments should create mechanisms to raise and resolve conflicts with each other such that cloud service providers' legal compliance in one country does not amount to a violation of law in another; and governments should support cross-border data flows. At the same time, the cloud service providers acknowledge that under the principles they recognise international human rights law enshrines a right to privacy, and the importance of customer trust and customers' control and security of their data. The signatories also said they commit to supporting laws that allow governments to request data through a transparent process that abides by human rights standards; international legal frameworks to resolve conflicting laws related to data access, privacy, and sovereignty; and improved rules and regulations at the national and international levels that protect the safety, privacy, and security of cloud customers and their ownership of data...

The Trusted Cloud Principles come days after a separate data cloud framework was stood up between Amazon Web Services, Google, IBM, Microsoft and other major tech giants, plus the EDM Council, a cross-industry trade association for data management and analytics. Under the Cloud Data Management Capabilities (CDMC) framework there are six components, 14 capabilities, and 37 sub-capabilities that set out cloud data management capabilities, standards, and best practices for cloud, multi-cloud, and hybrid-cloud implementations while also incorporating automated key controls for protecting sensitive data.

IBM

After IBM Failed To Sail an Autonomous Boat Across the Atlantic, It's Trying Again (washingtonpost.com) 69

After failing its first attempt to re-create the Mayflower's voyage across the Atlantic Ocean, a crewless ocean vessel, powered by artificial intelligence, has returned to sea. From a report: Propelled by IBM's AI software, the autonomous ship set out in June for a month-long excursion through rough waters with no humans aboard. However, three days into what was supposed to be a monumental journey from Plymouth, England, to Plymouth, Mass., where pilgrim travelers settled in 1620, the robot ship suffered "a minor mechanical issue," according to ProMare, a nonprofit promoting marine research that is behind the project. Researchers pushed out a software update, signaling for the ship to reverse course. The boat abided by its orders and headed to shore. Yet according to Brett Phaneuf, co-director of the Mayflower Autonomous Ship Project, the organizers quickly began planning another voyage. "We've had a setback, but one that will put us further ahead than if we did nothing," he said. Earlier this month, researchers sent the ship back out for a shorter trip: This time it'll focus on the waters around the United Kingdom, where crews can attend to it sooner if something unforeseen happens. "At some point, you have to go for it and take the risk or never improve," Phaneuf said.
Businesses

The IT Talent Gap is Still Growing (venturebeat.com) 109

IT executives see the talent shortage as the most significant adoption barrier to 64% of emerging technologies, according to a new Gartner survey. From a report: Across compute infrastructure and platform services, network, security, digital workplace, IT automation, and storage and database, respondents cited a lack of qualified candidates as a leading factor impeding tech deployment at their companies. "The ongoing push toward remote work and the acceleration of hiring plans in 2021 has exacerbated IT talent scarcity, especially for sourcing skills that enable cloud and edge, automation, and continuous delivery," Gartner research VP Yinuo Geng said in a press release.

"As one example, of all the IT automation technologies profiled in the survey, only 20% of them have moved ahead in the adoption cycle since 2020. The issue of talent is to blame here." The talent gaps are particularly acute for IT automation and digital workplace solutions, according to the executives surveyed -- a reflection of the demand for these technologies. According to McKinsey, nearly half of executives say their embrace of automation has accelerated, while digital and technology adoption is taking place about 25 times faster than before the pandemic. For example, Brain Corp reported that the use of robots to clean retail stores in the U.S. rose 24% in Q2 2020 year-over-year, and IBM has seen a surge in new users of its AI-driven customer service platform Watson Assistant.

Robotics

Astronauts In Space Will Soon Resurrect An AI Robot Friend Called CIMON (space.com) 17

A robot called CIMON-2 (short for Crew Interactive Mobile Companion) has received a software update that will enable it to perform more complex tasks with a new human crewmate later this year. Space.com reports: The cute floating sphere with a cartoon-like face has been stored at the space station since the departure of European Space Agency (ESA) astronaut Luca Parmitano in February 2020. The robot will wake up again during the upcoming mission of German astronaut Matthias Maurer, who will arrive at the orbital outpost with the SpaceX Crew-3 Dragon mission in October. In the year and a half since the end of the last mission, engineers have worked on improving CIMON's connection to Earth so that it could provide a more seamless service to the astronauts, CIMON project manager Till Eisenberg at Airbus, which developed the intelligent robot together with the German Aerospace Centre DLR and the LMU University in Munich, told Space.com.

"The sphere is just the front end," Eisenberg said. "All the voice recognition and artificial intelligence happens on Earth at an IBM data centre in Frankfurt, Germany. The signal from CIMON has to travel through satellites and ground stations to the data centre and back. We focused on improving the robustness of this connection to prevent disruptions." CIMON relies on IBM's Watson speech recognition and synthesis software to converse with astronauts and respond to their commands. The first generation robot flew to the space station with Alexander Gerst in 2018. That robot later returned to Earth and is now touring German museums. The current robot, CIMON-2, is a second generation. Unlike its predecessor, it is more attuned to the astronauts' emotional states (thanks to the Watson Tone Analyzer). It also has a shorter reaction time.

Airbus and DLR have signed a contract with ESA for CIMON-2 to work with four humans on the orbital outpost in the upcoming years. During those four consecutive missions, engineers will first test CIMON's new software and then move on to allowing the sphere to participate in more complex experiments. During these new missions CIMON will, for the first time, guide and document complete scientific procedures, Airbus said in a statement. "Most of the activities that astronauts perform are covered by step-by-step procedures," Eisenberg said. "Normally, they have to use clipboards to follow these steps. But CIMON can free their hands by floating close by, listening to the commands and reading out the procedures, showing videos, pictures and clarifications on its screen." The robot can also look up additional information and document the experiments by taking videos and pictures. The scientists will gather feedback from the astronauts to see how helpful the sphere really was and identify improvements for CIMON's future incarnations.

IBM

IBM's New Mainframe 7nm CPU Telum: 16 Cores At 5GHz, Virtual L3 and L4 Cache (arstechnica.com) 90

Long-time Slashdot reader UnknowingFool writes: Last week IBM announced their next-generation mainframe CPU, Telum. Manufactured on Samsung's 7nm node, each Telum processor has 8 cores, with each core running at a base clock of 5GHz. Two processors are combined in a package similar to AMD's chiplet design. A drawer in each mainframe can hold 4 packages (sockets), and the mainframe can hold 4 drawers for a combined 256 cores.

Different from previous generations, there is no dedicated L3 or L4 cache. Instead, each core has a 32MB L2 cache that can pool to become a 256MB "virtual" L3 cache on the same processor or a 2GB "virtual" L4 cache on the same drawer. Also included to help with AI is an on-die (but not on-core) inference accelerator running at 6 TFLOPS, using Intel's AVX-512 to communicate with the cores.
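
The "virtual" cache sizes follow from the per-core arithmetic; a minimal sketch, assuming the configuration described above (8 cores per chip, 2 chips per package, 4 packages per drawer):

```python
# Pooling 32MB per-core L2 caches into the "virtual" L3 and L4 described above.
L2_MB, CORES_PER_CHIP = 32, 8
CHIPS_PER_PACKAGE, PACKAGES_PER_DRAWER = 2, 4

virtual_l3 = L2_MB * CORES_PER_CHIP                          # all L2s on one chip
drawer_cores = CORES_PER_CHIP * CHIPS_PER_PACKAGE * PACKAGES_PER_DRAWER
virtual_l4 = L2_MB * drawer_cores                            # all L2s in one drawer

print(f"virtual L3 per chip: {virtual_l3} MB")               # 8 x 32 MB = 256 MB
print(f"virtual L4 per drawer: {virtual_l4 / 1024:.0f} GB")  # 64 x 32 MB = 2 GB
print(f"4 drawers: {4 * drawer_cores} cores")                # 256 cores, as above
```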

Education

Code.org Will Teach 'Cybersecurity Hygiene' to Millions of Students 29

Long-time Slashdot reader theodp writes: "Mr. President," Code.org founder Hadi Partovi told President Joe Biden and tech CEOs from Microsoft, Amazon, Google, Apple, and IBM at Wednesday's Presidential Summit on Cybersecurity, "America's cybersecurity problem is an education problem. I loved [Microsoft CEO] Satya Nadella's wonderful analogy to the car industry, and like Satya said, we need standards for seatbelts in every car for sure. But if none of the drivers took a course in basic safety skills, our roads could never, ever be safe. That's the current state of affairs on the roads of the internet. Without proper education, we can't address our nation's weakest link. If you look around, every CEO is nodding their head because they know we need a plan to educate every American on basic cyber security hygiene, and also a plan to staff up our cyber defense workforce. This needs to start early, in K-12, and reach everybody."

A newly-released White House Fact Sheet announcing "Ambitious Initiatives to Bolster the Nation's Cybersecurity" notes that tech-bankrolled "Code.org announced it will teach cybersecurity concepts to over 3 million students across 35,000 classrooms over 3 years, to teach a diverse population of students how to stay safe online, and to build interest in cybersecurity as a potential career."
