Intel

Intel Inks Deal with Department of Defense To Support Domestic Chip-Building Ecosystem (techcrunch.com) 28

Intel has signed a deal with the Department of Defense to support a domestic commercial chip-building ecosystem. The chipmaker will lead the first phase of a program called Rapid Assured Microelectronics Prototypes - Commercial (RAMP-C), which aims to bolster the domestic semiconductor supply chain. From a report: The chipmaker's recently launched division, Intel Foundry Services, will lead the program. As part of RAMP-C, Intel will partner with IBM, Cadence, Synopsys and others to establish a domestic commercial foundry ecosystem. Intel says the program was designed to create custom integrated circuits and commercial products required by the Department of Defense's systems. "The RAMP-C program will enable both commercial foundry customers and the Department of Defense to take advantage of Intel's significant investments in leading-edge process technologies," said Randhir Thakur, president of Intel Foundry Services, in a statement. "Along with our customers and ecosystem partners, including IBM, Cadence, Synopsys and others, we will help bolster the domestic semiconductor supply chain and ensure the United States maintains leadership in both R&D and advanced manufacturing."
Technology

Mastercard To Become First Payments Network To Phase Out Magnetic Stripe (mastercard.com) 125

Mastercard, writing in a blog post: In the early age of modern credit cards, merchants had to write down account information for each card-carrying customer by hand. Later, they used flatbed imprinting machines to record the card information on carbon paper packets, the sound of the sliding handle earning them the name "zip-zap machines." (They were also dubbed "knuckle-busters" by the unfortunate clerks who skinned their fingers on the embossing plate.) And how could clerks tell whether the customer was good for the purchase? They couldn't. Credit card companies would circulate a list of bad account numbers each month, and the merchant would have to compare the customers' cards against the list.

The arrival of the magnetic stripe changed all that. An early 1960s innovation largely credited to IBM, the magnetic stripe allowed banks to encode card information onto magnetic tape laminated to the back. It paved the way for electronic payment terminals and chip cards, offering more security and real-time authorization while making it easier for businesses of all sizes to accept cards. That thin stripe has remained a fixture on billions of payment cards for decades, even as technology has evolved. But now the magnetic stripe is reaching its expiration date with Mastercard becoming the first payments network to phase it out.
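As an aside on what that stripe actually stores: the layout has long been public (ISO/IEC 7813). Below is a rough, illustrative Rust sketch of pulling the card number, expiry and service code out of a Track 2-style string. The sample value uses a standard test card number, and real readers perform further validation (LRC check, Luhn digit) that is omitted here.

```rust
// Illustrative parser for an ISO/IEC 7813 Track 2-style string, the kind of
// data a magnetic stripe reader returns. The sample value is a test number.
fn parse_track2(raw: &str) -> Option<(String, String, String)> {
    // Track 2 runs from a ';' start sentinel to a '?' end sentinel:
    // ;PAN=YYMM<service code><discretionary data>?
    let inner = raw.strip_prefix(';')?.strip_suffix('?')?;
    let (pan, rest) = inner.split_once('=')?;
    if pan.len() > 19 || !pan.chars().all(|c| c.is_ascii_digit()) {
        return None; // PAN is at most 19 digits
    }
    let expiry = rest.get(0..4)?; // expiry, YYMM
    let service_code = rest.get(4..7)?; // three-digit service code
    Some((pan.to_string(), expiry.to_string(), service_code.to_string()))
}

fn main() {
    // Fabricated track using the well-known Visa test PAN, not a real card.
    let raw = ";4111111111111111=25121010000000000000?";
    match parse_track2(raw) {
        Some((pan, expiry, sc)) => {
            println!("PAN: {}, expiry (YYMM): {}, service code: {}", pan, expiry, sc)
        }
        None => println!("not a valid Track 2 string"),
    }
}
```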

The shift away from the magnetic stripe points to both consumers' changing payment habits and the development of newer technologies. Today's chip cards are powered by microprocessors that are much more capable and secure, and many are also embedded with tiny antennae that enable contactless transactions. Biometric cards, which combine fingerprints with chips to verify a cardholder's identity, offer another layer of security. Based on the decline in payments powered by magnetic stripes after chip-based payments took hold, newly issued Mastercard credit and debit cards will not be required to have a stripe starting in 2024 in most markets. By 2033, no Mastercard credit or debit cards will have magnetic stripes, which leaves a long runway for the remaining partners who still rely on the technology to phase in chip card processing.

Hardware

Nokia's Smartphone: 25 Years Since it Changed the World (dw.com) 17

The Nokia 9000 Communicator -- "the office in your back pocket" -- was a smartphone even before the word was invented. It has been 25 years since it revolutionized the market. DW: Nokia presented its 9000 Communicator at the CeBIT 1996 computer fair in Hanover, Germany, and launched it on August 15 of that year. "The office in your back pocket" followed the IBM Simon from 1994 and the HP OmniGo 700LX from March 1996. The 9000 Communicator was a smartphone even before the word had been invented. For a decade, the device was what a smartphone was supposed to look like. After the Communicator, Blackberry perfected the idea -- until Apple's iPhone with its multitouch screen came along in 2007.

Opened like a mini-laptop, with a keyboard and a black-and-white display measuring just 11.5 centimeters (4.5 inches) diagonally, the retrofuturistic-looking device was made famous by actor Val Kilmer in the remake of the film The Saint. The 9000 Communicator was the first device to offer a combination of keyboard, quality screen, and business and internet software in one package. For the first time, it put all the features of a computer on a phone, bringing email, web browsing, fax, word processing and spreadsheets together in a single pocketable device.

Hardware

Samsung is Using AI to Design a Smartphone Chip. Will Others Follow? (arstechnica.com) 60

"Samsung is using artificial intelligence to automate the insanely complex and subtle process of designing cutting-edge computer chips," reports Wired: The South Korean giant is one of the first chipmakers to use AI to create its chips. Samsung is using AI features in new software from Synopsys, a leading chip design software firm used by many companies...

Others, including Google and Nvidia, have talked about designing chips with AI. But Synopsys' tool, called DSO.ai, may prove the most far-reaching because Synopsys works with dozens of companies. The tool has the potential to accelerate semiconductor development and unlock novel chip designs, according to industry watchers. Synopsys has another valuable asset for crafting AI-designed chips: years of cutting-edge semiconductor designs that can be used to train an AI algorithm. A spokesperson for Samsung confirms that the company is using Synopsys AI software to design its Exynos chips, which are used in smartphones, including its own branded handsets, as well as other gadgets...

Chipmakers including Nvidia and IBM are also dabbling in AI-driven chip design. Other makers of chip-design software, including Cadence, a competitor to Synopsys, are also developing AI tools to aid with mapping out the blueprints for a new chip.

But Synopsys's co-CEO tells Wired that Samsung's chip will be "the first of a real commercial processor design with AI."
IBM

The IBM PC Turns 40 (theregister.com) 117

The Register's Richard Speed commemorates the 40th anniversary of the introduction of the IBM Model 5150: IBM was famously late to the game when the Model 5150 (or IBM PC) put in an appearance. The likes of Commodore and Apple pretty much dominated the microcomputer world as the 1970s came to a close and the 1980s began. Big Blue, on the other hand, was better known for its sober, business-orientated products and its eye-watering price tags. However, as its customers began eyeing Apple products, IBM lumbered toward the market, creating a working group that could dispense with the traditional epic lead-times of Big Blue and take a more agile approach. One key choice was to use off-the-shelf hardware and software and to adopt an open architecture -- a significant choice, as things turned out.

Intel's 8088 was selected over the competition (including IBM's own RISC processor) and, famously, Microsoft was tapped to provide PC DOS as well as the BASIC included in the ROM. So this marks the 40th anniversary of PC DOS, aka MS-DOS, too. You can find Microsoft's old MS-DOS source code here. The basic price for the 5150 was $1,565, with a fully loaded system rising to more than $3,000. Users could enjoy high-resolution monochrome text via the MDA card or some low-resolution graphics (and vaguely nauseating colors) through a CGA card (which could be installed simultaneously). RAM landed in 16 or 64kB flavors and could be upgraded to 256kB while the Intel 8088 CPU chugged along at 4.77 MHz.

Storage came courtesy of up to two 5.25" floppy drives, and the ability to attach a cassette recorder -- an option swiftly stripped from later models. There was no hard disk, and adding one presented a problem even for users with deep enough pockets: the motherboard and software didn't support it and the power supply was a bit weedy. IBM would resolve this as the PC evolved. Importantly, the motherboard also included slots for expansion, which eventually became known as the Industry Standard Architecture (ISA) bus as the IBM PC clone sector exploded. IBM's approach resulted in an immense market for expansion cards and third-party software.
While the Model 5150 "sold like hotcakes," Speed notes that it was eventually discontinued in 1987.
Businesses

These People Who Work From Home Have a Secret: They Have Two Jobs (wsj.com) 168

When the pandemic freed employees from having to report to the office, some saw an opportunity to double their salary on the sly. From a report: They were bored. Or worried about layoffs. Or tired of working hard for a meager raise every year. They got another job offer. Now they have a secret. A small, dedicated group of white-collar workers, in industries from tech to banking to insurance, say they have found a way to double their pay: Work two full-time remote jobs, don't tell anyone and, for the most part, don't do too much work, either. Alone in their home offices, they toggle between two laptops. They play "Tetris" with their calendars, trying to dodge endless meetings. Sometimes they log on to two meetings at once. They use paid time off -- in some cases, unlimited -- to juggle the occasional big project or ramp up at a new gig. Many say they don't work more than 40 hours a week for both jobs combined. They don't apologize for taking advantage of a system they feel has taken advantage of them.

[...] Gig work and outsourcing have been on the rise for years. Inflation is now ticking up, chipping away at spending power. Some employees in white-collar fields wonder why they should bother spending time building a career. "The harder that you work, it seems like the less you get," one of the workers with two jobs says. "People depend on you more. My paycheck is the same." A website called Overemployed says it has a solution. "There's no implied lifetime employment anymore, not even at IBM," writes one of the website's co-founders, a 38-year-old who works for two tech companies in the San Francisco Bay Area. The site serves up tips on setting low expectations with bosses, staying visible at meetings and keeping LinkedIn profiles free of red flags. (A "social-media cleanse" is a solid excuse for an outdated LinkedIn profile, it says.) In a chat on the messaging platform Discord, people from around the world swap advice about employment checks and downtime at various brand-name companies.

Medicine

IBM's AI Can Predict How Parkinson's Disease May Progress In Individuals (engadget.com) 7

An anonymous reader quotes a report from Engadget: [R]esearchers from IBM and the Michael J. Fox Foundation (MJFF) say they've developed a program that can predict how the symptoms of a Parkinson's disease patient will progress in terms of both timing and severity. In The Lancet Digital Health journal, they claim the software could transform how doctors help patients manage their symptoms by allowing them to better predict how the disease will progress. The breakthrough wouldn't have been possible without the Parkinson's Progression Markers Initiative, a study MJFF sponsored. IBM describes the dataset, which includes information on more than 1,400 individuals, as the "largest and most robust volume of longitudinal Parkinson's patient data to date" and says it allowed its AI model to map out complex symptom and progression patterns.
Bug

Everyone Cites That 'Bugs Are 100x More Expensive To Fix in Production' Research, But the Study Might Not Even Exist (theregister.com) 118

"Software research is a train wreck," says Hillel Wayne, a Chicago-based software consultant who specialises in formal methods, instancing the received wisdom that bugs are way more expensive to fix once software is deployed. Wayne did some research, noting that "if you Google 'cost of a software bug' you will get tons of articles that say 'bugs found in requirements are 100x cheaper than bugs found in implementations.' They all use this chart from the 'IBM Systems Sciences Institute'... There's one tiny problem with the IBM Systems Sciences Institute study: it doesn't exist." The Register: Laurent Bossavit, an Agile methodology expert and technical advisor at software consultancy CodeWorks in Paris, has dedicated some time to this matter, and has a post on GitHub called "Degrees of intellectual dishonesty". Bossavit referenced a successful 1987 book by Roger S Pressman called Software Engineering: a Practitioner's Approach, which states: "To illustrate the cost impact of early error detection, we consider a series of relative costs that are based on actual cost data collected for large software projects [IBM81]." The reference to [IBM81] notes that the information comes from "course notes" at the IBM Systems Sciences Institute. Bossavit discovered, though, that many other publications have referenced Pressman's book as the authoritative source for this research, disguising its tentative nature.

Bossavit took the time to investigate the existence of the IBM Systems Sciences Institute, concluding that it was "an internal training program for employees." No data was available to support the figures in the chart, which shows the cost of fixing a bug rising neatly to 100x once software is in maintenance. "The original project data, if any exist, are not more recent than 1981, and probably older; and could be as old as 1967," said Bossavit, who also described "wanting to crawl into a hole when I encounter bullshit masquerading as empirical support for a claim, such as 'defects cost more to fix the later you fix them'."

IBM

What Ever Happened to IBM's Watson? (nytimes.com) 75

After Watson triumphed on the game show Jeopardy! in 2011, its star scientist had to convince IBM that it wasn't a magic answer box, and "explained that Watson was engineered to identify word patterns and predict correct answers for the trivia game."

The New York Times looks at what's happened in the decade since: Watson has not remade any industries. And it hasn't lifted IBM's fortunes. The company trails rivals that emerged as the leaders in cloud computing and A.I. — Amazon, Microsoft and Google. While the shares of those three have multiplied in value many times, IBM's stock price is down more than 10 percent since Watson's "Jeopardy!" triumph in 2011.... The company's missteps with Watson began with its early emphasis on big and difficult initiatives intended to generate both acclaim and sizable revenue for the company, according to many of the more than a dozen current and former IBM managers and scientists interviewed for this article... The company's top management, current and former IBM insiders noted, was dominated until recently by executives with backgrounds in services and sales rather than technology product experts. Product people, they say, might have better understood that Watson had been custom-built for a quiz show, a powerful but limited technology...

IBM insists that its revised A.I. strategy — a pared-down, less world-changing ambition — is working... But the grand visions of the past are gone. Today, instead of being a shorthand for technological prowess, Watson stands out as a sobering example of the pitfalls of technological hype and hubris around A.I. The march of artificial intelligence through the mainstream economy, it turns out, will be more step-by-step evolution than cataclysmic revolution.

One example: IBM technologists approached cancer medical centers, but "were frustrated by the complexity, messiness and gaps in the genetic data at the cancer center... At the end of last year, IBM discontinued Watson for Genomics, which grew out of the joint research with the University of North Carolina. It also shelved another cancer offering, Watson for Oncology, developed with another early collaborator, the Memorial Sloan Kettering Cancer Center..." IBM continued to invest in the health industry, including billions on Watson Health, which was created as a separate business in 2015. That includes more than $4 billion to acquire companies with medical data, billing records and diagnostic images on hundreds of millions of patients. Much of that money, it seems clear, IBM is never going to get back. Now IBM is paring back Watson Health and reviewing the future of the business. One option being explored, according to a report in The Wall Street Journal, is to sell off Watson Health...

Many outside researchers long dismissed Watson as mainly a branding campaign. But recently, some of them say, the technology has made major strides... The business side of Watson also shows signs of life. Now, Watson is a collection of software tools that companies use to build A.I.-based applications — ones that mainly streamline and automate basic tasks in areas like accounting, payments, technology operations, marketing and customer service. It is workhorse artificial intelligence, and that is true of most A.I. in business today. A core Watson capability is natural language processing — the same ability that helped power the "Jeopardy!" win. That technology powers IBM's popular Watson Assistant, used by businesses to automate customer service inquiries...

IBM says it has 40,000 Watson customers across 20 industries worldwide, more than double the number four years ago. Watson products and services are being used 140 million times a month, compared with a monthly rate of about 10 million two years ago, IBM says. Some of the big customers are in health, like Anthem, a large insurer, which uses Watson Assistant to automate customer inquiries.

"Adoption is accelerating," Mr. Thomas said.

Open Source

Libre-SOC's Open Hardware 180nm ASIC Submitted To IMEC for Fabrication (openpowerfoundation.org) 38

"We're building a chip. A fast chip. A safe chip. A trusted chip," explains the web page at Libre-SOC.org. "A chip with lots of peripherals. And it's VPU. And it's a 3D GPU... Oh and here, have the source code."

And now there's big news, reports long-time Slashdot reader lkcl: Libre-SOC's entirely Libre 180nm ASIC, which can be replicated down to symbolic level GDS-II with no NDAs of any kind, has been submitted to IMEC for fabrication.

It is the first wholly-independent Power ISA ASIC outside of IBM to go to silicon in 12 years. Microwatt went to Skywater 130nm in March; however, it too was developed by IBM, as an exceptionally well-made reference design, which Libre-SOC used for verification.

Whilst it would seem that Libre-SOC is jumping on the chip-shortage era's innovation bandwagon, Libre-SOC has actually been in development for over three and a half years so far. It even pre-dates the OpenLane initiative, and has the same objectives: fully automated HDL to GDS-II, full transparency and auditability with Libre VLSI tools Coriolis2 and Libre Cell Libraries from Chips4Makers.

With €400,000 in funding from the NLNet Foundation [a long-standing non-profit supporting privacy, security, and the "open internet"], plus an application to NGI Pointer under consideration, the next steps are to continue development of Draft Cray-style Vectors (SVP64) to the already supercomputer-level Power ISA, under the watchful eye of the upcoming OpenPOWER ISA Workgroup.

Open Source

Experimental Rust Support Patches Submitted to Linux Kernel Mailing List (theregister.com) 55

"The Rust for Linux project, sponsored by Google, has advanced..." reported the Register earlier this week: A new set of patches submitted to the Linux kernel mailing list summarizes the progress of the project to enable Rust to be used alongside C for implementing the Linux kernel. The progress is significant.

- ARM and RISC-V architectures are now supported, thanks to work on rustc_codegen_gcc, which is a GCC codegen for rustc. This means that rustc does the initial compilation of Rust code but GCC (the GNU Compiler Collection) does the backend compilation, enabling support for the architectures that GCC supports...

- Overall, "the Rust support is still to be considered experimental. However, as noted back in April, support is good enough that kernel developers can start working on the Rust abstractions for subsystems and write drivers and other modules," continued project leader Miguel Ojeda, a computer scientist at CERN in Geneva, Switzerland, now working full time on Rust for Linux...
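To give a flavor of what "write drivers and other modules" looks like in practice, here is a minimal module sketch modeled on the "hello world" sample circulated with the patch series. The `module!` macro fields and the `KernelModule` trait follow the 2021 RFC-era patches and have changed across revisions, so treat the exact names as assumptions; this builds only inside a kernel tree carrying the Rust support, not as a standalone crate.

```rust
// SPDX-License-Identifier: GPL-2.0
//! Minimal Rust kernel module, modeled on the sample shipped with the
//! Rust-for-Linux patch series.

use kernel::prelude::*;

module! {
    type: RustMinimal,
    name: b"rust_minimal",
    author: b"Rust for Linux contributors",
    description: b"A minimal Rust kernel module",
    license: b"GPL v2",
}

struct RustMinimal;

impl KernelModule for RustMinimal {
    fn init() -> Result<Self> {
        // Runs at insmod time; pr_info! maps onto the kernel's printk.
        pr_info!("Rust minimal module loaded\n");
        Ok(RustMinimal)
    }
}

impl Drop for RustMinimal {
    fn drop(&mut self) {
        // Runs at rmmod time.
        pr_info!("Rust minimal module unloaded\n");
    }
}
```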

There is substantial support for the project across the industry. Google said in April "we feel that Rust is now ready to join C as a practical language for implementing the kernel" and that it would reduce the number of potential bugs and security vulnerabilities. Google is sponsoring Ojeda to work full time on the project for a year, via the ISRG (Internet Security Research Group), which said last month that it is part of "efforts to move the internet's critical software infrastructure to memory safe code," under the project name Prossimo. The ISRG is also the nonprofit organisation behind Let's Encrypt free security certificates. Ojeda also mentioned that Microsoft's Linux Systems Group is contributing and hopes to submit "select Hyper-V drivers written in Rust." Arm is promising assistance with Rust for Linux on ARM-based systems. IBM has contributed Rust kernel support for its PowerPC processor.

More detail is promised at the forthcoming Linux Plumbers Conference in September. In the meantime, the project is on GitHub here.

"In addition, we would like to announce that we are organizing a new conference that focuses on Rust and the Linux kernel..." Ojeda posted. "Details will be announced soon." And for context, the Register adds: Linus Torvalds has said on several occasions that he welcomes the possibility of using Rust alongside C for kernel development, and told IT Wire in April that it is "getting to the point where maybe it might be mergeable for 5.14 or something like that."
United States

New York City Launches a Cyberdefense Center in Manhattan (engadget.com) 14

Infrastructure cyberattacks are quickly becoming a significant problem in the US, and New York City is opening a facility that could help fend off those potentially dangerous hacks. From a report: The Wall Street Journal reports that NYC has launched a long-in-the-making Cyber Critical Services and Infrastructure (CCSI) operations center in Manhattan to defend against major cyberattacks. The initiative's members are a mix of public and private sector organizations that include Amazon, the Federal Reserve Bank, IBM, the New York Police Department and multiple healthcare providers. If a cyberattack hits, they'll ideally cooperate closely to both overcome the attack and muster a city response if the digital offensive hobbles NYC's infrastructure. Politicians first floated the idea in 2017, but CCSI has been a strictly virtual initiative until now.
Businesses

Jim Whitehurst Steps Down as President at IBM Just 14 Months After Taking Role (techcrunch.com) 57

In a surprise announcement today, IBM said that Jim Whitehurst, who came over in the Red Hat deal, would be stepping down as company president just 14 months after taking over in that role. From a report: IBM didn't give a lot of details as to why he was stepping away, but acknowledged his key role in helping bring the 2018 $34 billion Red Hat deal to fruition and helping bring the two companies together after the deal closed. "Jim has been instrumental in articulating IBM's strategy, but also, in ensuring that IBM and Red Hat work well together and that our technology platforms and innovations provide more value to our clients," the company stated.

He will stay on as a senior advisor to CEO Arvind Krishna, but it raises the question of why he is leaving after such a short time in the role, and what he plans to do next. Oftentimes after a deal of this magnitude closes, there is an agreement as to how long key executives will stay. It could be simply that the period has expired and Whitehurst wants to move on, but some saw him as the heir apparent to Krishna, and the move comes as a surprise when looked at in that context.

IBM

IBM's 18-Month Company-Wide Email System Migration Has Been a Disaster (theregister.com) 117

An anonymous reader quotes a report from The Register: IBM's planned company-wide email migration has gone off the rails, leaving many employees unable to use email or schedule calendar events. And this has been going on for several days. Current and former IBMers have confirmed to The Register that the migration, 18 months in the making, has been a disaster. We've been told that email service has been intermittent for the past four or five days, and not everyone has been affected in the same way. Lack of access has been shorter for some -- one source told us email was back after two days of downtime. Slack is said to be working, though Outlook, Verse (IBM's webmail), and Notes have been unreliable.

"Outlook won't work with the new system, IBM Notes won't work and the online email called Verse has now gone down," a tipster told us. "Everyone has been affected and no fix is in sight." One source we spoke with laid the blame on IBM CFO James Kavanaugh for cutting costs and not hiring the right people. Over the weekend, a source told us, a blog post to IBM's internal network w3 said the migration had been planned for 18 months and that everything should go fine provided everyone follows the instructions emailed to them. Evidently, this did not happen. Since then, a banner has gone up on w3 pointing people to a Slack channel for updates on the situation, and IBM's CIO has posted a note to employees addressing the problems. Presently, the w3 status page returns an error. We're told that the migration plan followed from IBM's decision in 2018 to sell various software products, including Notes, to India-based HCL Technologies. Following the sale, Big Blue didn't want its data on HCL's servers. "They were transitioning to IBM-owned servers," one source told us. "That's where it broke down."
There's also concern that "disappeared messages may not be restored," says The Register. "We've even heard that IBM employees have been approached by recruiters posing questions like, 'Why are you still at IBM? They can't even get email straight.'"
Hardware

Quantum-Computing Startup Rigetti To Offer Modular Processors (arstechnica.com) 10

An anonymous reader quotes a report from Ars Technica: A quantum-computing startup announced Tuesday that it will make a significant departure in its designs for future quantum processors. Rather than building a monolithic processor as everyone else has, Rigetti Computing will build smaller collections of qubits on chips that can be physically linked together into a single functional processor. This isn't multiprocessing so much as a modular chip design. The decision has several consequences, both for Rigetti processors and quantum computing more generally. We'll discuss them below.

Rigetti's computers rely on a technology called a "transmon," based on a superconducting wire loop linked to a resonator. That's the same qubit technology used by larger competitors like Google and IBM. Transmons are set up so that the state of one can influence that of its neighbors during calculations, an essential feature of quantum computing. To an extent, the topology of connections among transmon qubits is a key contributor to the machine's computational power. Two other factors that currently hold back performance are the error rate of individual qubits and the qubit count. Scaling up the qubit count can boost the computational power of a processor -- but only if all the added qubits are of sufficiently high quality that the error rate doesn't limit the ability to perform accurate computations. Once qubit counts reach the thousands, error correction becomes possible, which changes the process significantly. At the moment, though, we're stuck with fewer than 100 qubits. So this change is still in the indefinite future.

For Rigetti, the ability to merge several smaller processors -- which it has already shown it can produce -- into a single larger one should let it run up its qubit count relatively rapidly. In today's announcement, the company expects that an 80-qubit processor will be available within the next few months. (For context, IBM's roadmap includes plans for a 127-qubit processor sometime this year.) The other advantage of moving away from a monolithic design is that most chips tend to have one or more qubits that are either defective or have an unacceptably high error rate. By going with a modular design, the consequences of that are reduced. Rigetti can manufacture a large collection of modules and assemble chips from those with the fewest defects. Alternately, the company can potentially select for the modules that have qubits with low error rates and build the equivalent of an all-star processor. The reduced error rate could possibly offset the impact of a lower qubit count.
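Some back-of-the-envelope arithmetic shows why modularity helps with yield. Assuming, purely for illustration, a 1 percent chance that any given qubit is defective (not a published Rigetti figure), a sketch:

```rust
// Back-of-the-envelope yield arithmetic for monolithic vs. modular chips.
// The 1% per-qubit defect rate is an illustrative assumption.
fn main() {
    let p_good: f64 = 0.99; // assumed probability one qubit is defect-free
    let total_qubits: i32 = 80;
    let module_size: i32 = 10;

    // Monolithic die: all 80 qubits must come out good at once.
    let monolithic_yield = p_good.powi(total_qubits);

    // Modular: each 10-qubit module must be fully good, but a bad module
    // is discarded on its own; the rest of the processor is unaffected.
    let module_yield = p_good.powi(module_size);
    let modules_needed = total_qubits / module_size;
    let expected_fabbed = modules_needed as f64 / module_yield;

    println!("monolithic 80-qubit yield: {:.1}%", monolithic_yield * 100.0);
    println!("10-qubit module yield:     {:.1}%", module_yield * 100.0);
    println!("modules fabbed for 8 good ones: {:.1}", expected_fabbed);
}
```

Under that assumption, fewer than half of monolithic 80-qubit dies come out fully clean, while roughly nine in ten small modules do, and a defective module wastes one module rather than an entire die.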

Science

How Quantum Computers are Already Untangling Nature's Mysteries (wired.co.uk) 36

Wired published a long extract from Amit Katwala's book Quantum Computing: How It Works and How It Could Change the World — explaining how it's already being put to use to explore some of science's biggest secrets by simulating nature itself: Some of the world's top scientists are engaged in a frantic race to find new battery technologies that can replace lithium-ion with something cleaner, cheaper and more plentiful. Quantum computers could be their secret weapon... Although we've known all the equations we need to simulate chemistry since the 1930s, we've never had the computing power available to do it...

In January 2020, researchers at IBM published an early glimpse of how quantum computers could be useful in the Noisy Intermediate-Scale Quantum Computing era. Working with the German car manufacturer Daimler on improving batteries for electric vehicles, they used a small-scale quantum computer to simulate the behaviour of three molecules containing lithium, which could be used in the next generation of lithium-sulphur batteries that promise to be more powerful and cheaper than today's power cells...

Some other examples:
  • "Chemistry challenges just waiting for a quantum computer powerful and reliable enough to crack them range from the extraction of metals by catalysis through to carbon dioxide fixation, which could be used to capture emissions and slow climate change. But the one with the potential for the biggest impact might be fertiliser production... Some plants rely on bacteria which use an enzyme called nitrogenase to 'fix' nitrogen from the atmosphere and incorporate it into ammonia. Understanding how this enzyme works would be an important step towards...creating less energy-intensive synthetic fertilisers."
  • "Solar panels are another area where quantum computers could help, by accelerating the search for new materials. This approach could also help to identify new materials for batteries, and superconductors that work at room temperature, which would drive advances in motors, magnets and perhaps even quantum computers themselves...."
  • "Quantum computing could help scientists model complex interactions and processes in the body, enabling the discovery of new treatments for diseases such as Alzheimer's, or a quicker understanding of new diseases such as Covid-19. Artificial intelligence is already being used by companies such as DeepMind to gain insight into protein folding — a key facet of growth and disease — and quantum computers will accelerate this effort."

AI

Robotic AI-Powered Ship Tries Retracing Mayflower's Voyage, Has to Turn Back (apnews.com) 34

Check out this video footage of the sleek Mayflower 400 slicing through the water, hoping to retrace the historic 1620 journey of the famous ship which carried pilgrims to America. Unfortunately, unlike the real Mayflower, this robotic 21st-century doppelganger "had to turn back Friday to fix a mechanical problem," reports the Associated Press: Nonprofit marine research organization ProMare, which worked with IBM to build the autonomous ship, said it made the decision to return to base "to investigate and fix a minor mechanical issue" but hopes to be back on the trans-Atlantic journey as soon as possible.

With no humans on board the ship, there's no one to make repairs while it's at sea.

Piloted by artificial intelligence technology, the 50-foot (15-meter) Mayflower Autonomous Ship began its trip early Tuesday, departing from Plymouth, England, and spending some time off the Isles of Scilly before it headed for deeper waters.

It was supposed to take up to three weeks to reach Provincetown on Cape Cod before making its way to Plymouth, Massachusetts. If successful, it would be the largest autonomous vessel to cross the Atlantic.

Linux

Linux Foundation Readies Global COVID Certificate Network (zdnet.com) 131

An anonymous reader quotes a report from ZDNet: The Linux Foundation Public Health (LFPH) is getting the Global COVID Certificate Network (GCCN) ready for deployment. The GCCN [...] really is a coronavirus vaccine passport. It will work by establishing a global trust registry network, which will enable interoperable and trustworthy exchanges of COVID certificates among countries for safe reopening, and will provide related technology and guidance for implementation. It's being built by the Linux Foundation Public Health and its allies, Affinidi, AOKPass, Blockchain Labs, Evernym, IBM, Indicio.Tech, LACChain, Lumedic, Proof Market, and ThoughtWorks. These companies have already implemented COVID certificate or pass systems for governments and industries. Together they will define and implement GCCN. This, it's hoped, will be the model for a true international vaccine registry.

Once completed, the GCCN's trust registry network will enable each country to publish a list of the authorized issuers of COVID certificates that can be digitally verified by authorities in other countries. This will bridge the gap between technical specifications (e.g. W3C Verifiable Credentials or SMART Health Card) and a complete trust architecture required for safe reopening. This is vital because as Brian Behlendorf, the Linux Foundation's General Manager for Blockchain, Healthcare, and Identity explained, "The first wave of apps for proving one's COVID status did not allow that proof to be shown beyond a single state or nation, did not avoid vendor lock-in and did not distinguish between rich health data and simple passes. The Blueprint gives this industry a way to solve those issues while meeting a high bar for privacy and integrity, and GCCN turns those plans into action."
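To make the trust-registry idea concrete, here is a toy sketch of the lookup step: each country publishes the key fingerprints of its authorized issuers, and a verifier abroad checks an incoming certificate's issuer against that list. All names and the fingerprint format are invented for illustration, and the cryptographic signature check a real GCCN verifier would also perform (e.g., on a W3C Verifiable Credential) is elided.

```rust
use std::collections::{HashMap, HashSet};

// Toy model of the trust-registry lookup described above. Real verification
// would also validate the signature on the credential itself.
struct TrustRegistry {
    // country code -> fingerprints of that country's authorized issuers
    issuers: HashMap<String, HashSet<String>>,
}

struct CovidCertificate<'a> {
    country: &'a str,
    issuer_fingerprint: &'a str,
    kind: &'a str, // e.g. "vaccination"
}

impl TrustRegistry {
    fn is_trusted(&self, cert: &CovidCertificate) -> bool {
        self.issuers
            .get(cert.country)
            .map_or(false, |set| set.contains(cert.issuer_fingerprint))
    }
}

fn main() {
    let mut issuers = HashMap::new();
    issuers.insert(
        "NL".to_string(),
        HashSet::from(["a1:b2:c3".to_string()]), // hypothetical fingerprint
    );
    let registry = TrustRegistry { issuers };

    let cert = CovidCertificate {
        country: "NL",
        issuer_fingerprint: "a1:b2:c3",
        kind: "vaccination",
    };
    println!("{} certificate trusted: {}", cert.kind, registry.is_trusted(&cert));
}
```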

Once in place, the GCCN will support Global COVID Certificates (GCC). These certificates will have three use cases: Vaccination, recovery from infection, and test results. They will be available in both paper and digital formats. Participating governments and industry alliances will decide what COVID certificates they issue and accept. The GCC schema definitions and minimal datasets will follow the recommendations of the Blueprint, as well as GCCN's technical and governance documents, implementation guide, and open-source reference implementations, which will be developed in collaboration with supporting organizations and the broader LFPH community. Besides setting the specs and designs, the GCCN community will also offer peer-based implementation and governance guidance to governments and industries to help them implement COVID certificate systems. This will include how to build national and state trust registries and infrastructure. They'll also provide guidance on how to leverage GCC into their existing coronavirus vaccine systems.

IBM

Will Labor Shortages Give Workers More Power? (msn.com) 174

It's been argued that technology (especially automation) will continue weakening the position of workers. But today the senior economics correspondent for The New York Times argues that the "profound shift" happening in America is something else entirely.

"For the first time in a generation, workers are gaining the upper hand..." Up and down the wage scale, companies are becoming more willing to pay a little more, to train workers, to take chances on people without traditional qualifications, and to show greater flexibility in where and how people work. The erosion of employer power began during the low-unemployment years leading up to the pandemic and, given demographic trends, could persist for years. March had a record number of open positions, according to federal data that goes back to 2000, and workers were voluntarily leaving their jobs at a rate that matches its historical high. Burning Glass Technologies, a firm that analyzes millions of job listings a day, found that the share of postings that say "no experience necessary" is up two-thirds over 2019 levels, while the share of those promising a starting bonus has doubled.

People are demanding more money to take a new job. The "reservation wage," as economists call the minimum compensation workers would require, was 19 percent higher for those without a college degree in March than in November 2019, a jump of nearly $10,000 a year, according to a survey by the Federal Reserve Bank of New York... [T]he demographic picture is not becoming any more favorable for employers eager to fill positions. Population growth for Americans between ages 20 and 64 turned negative last year for the first time in the nation's history. The Congressional Budget Office projects that the potential labor force will grow a mere 0.3 to 0.4 percent annually for the remainder of the 2020s; the size of the work force rose an average of 0.8 percent a year from 2000 to 2020.

The article describes managers now "being forced to learn how to operate amid labor scarcity... At the high end of the labor market, that can mean workers are more emboldened to leave a job if employers are insufficiently flexible on issues like working from home..."

But it also notes a ride-sharing driver who switched to an IBM apprenticeship to become a cloud storage engineer, and former Florida nightclub bouncer Alex Lorick, who became an IBM mainframe technician, "part of a deliberate effort by IBM to rethink how it hires and what counts as a qualification for a given job." [IBM] executives concluded that the qualifications for many jobs were unnecessarily demanding. Postings might require applicants to have a bachelor's degree, for example, for jobs that a six-month training course would adequately prepare a person for.

"By creating your own dumb barriers, you're actually making your job in the search for talent harder," said Obed Louissaint, IBM's senior vice president for transformation and culture. In working with managers across the company on training initiatives like the one under which Mr. Lorick was hired, "it's about making managers more accountable for mentoring, developing and building talent versus buying talent."

"I think something fundamental is changing, and it's been happening for a while, but now it's accelerating," Mr. Louissaint said.

China

The Woman Who Mastered IBM's 5,400-character Chinese Typewriter (fastcompany.com) 58

Fast Company's technology editor harrymcc writes: In the 1940s, IBM tried to market a typewriter capable of handling all 5,400 Chinese characters. The catch was that using it required memorizing a 4-digit code for each character. But a young woman named Lois Lew tackled the challenge and demoed the typewriter for the company in presentations from Manhattan to Shanghai.
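Mechanically, that input scheme amounts to a giant lookup table from numeric codes to glyphs, as in this toy Rust sketch. The code/character pairings here are invented; only the 4-digit-code mechanism, covering all 5,400 characters on the real machine, is from the story.

```rust
use std::collections::HashMap;

// Toy model of the typewriter's input scheme: the operator keys a 4-digit
// code and the machine selects the matching character.
fn main() {
    let codebook: HashMap<&str, char> =
        [("0421", '中'), ("1550", '文'), ("4862", '打')].into_iter().collect();

    for code in ["0421", "4862", "9999"] {
        match codebook.get(code) {
            Some(ch) => println!("{} -> {}", code, ch),
            None => println!("{} -> (no such code)", code),
        }
    }
}
```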

More than 70 years later, Lew, now in her 90s, told her remarkable story to Thomas S. Mullaney for Fast Company.
