RSS Rabbit
News that matters, fast.
Good luck, have news.
Happy scrolling!

Categories



Date/Time of Last Update: Sat Sep 24 18:00:45 2022 UTC




********** CLIMATE **********



UK environment laws under threat in ‘deregulatory free-for-all’
Fri, 23 Sep 2022 08:22:34 GMT

Campaigners say revoking of post-Brexit protections amounts to legislative vandalism

Hundreds of Britain’s environmental laws covering water quality, sewage pollution, clean air, habitat protections and the use of pesticides are lined up for removal from UK law under a government bill.

Environmentalists accused Liz Truss’s government of reneging on a commitment made after Brexit to halt the decline of nature by 2030. They say the revoking of 570 environmental laws that were rolled over from EU law after Brexit amounts to a deregulatory free-for-all leaving the environment unprotected.

Continue reading...
Match ID: 0 Score: 40.00 source: www.theguardian.com age: 1 day
qualifiers: 40.00 air pollution

Climate Change is NSF Engineering Alliance’s Top Research Priority
Tue, 20 Sep 2022 20:00:00 +0000


Since its launch in April 2021, the Engineering Research Visioning Alliance has convened a diverse set of experts to explore three areas in which fundamental research could have the most impact: climate change; the nexus of biology and engineering; and securing critical infrastructure against hackers.

To identify priorities for each theme, ERVA—an initiative funded by the U.S. National Science Foundation—holds what are termed visioning events, wherein IEEE members and hundreds of other experts from academia, industry, and nonprofits can conceptualize bold ideas. The results are distilled into reports that identify actionable priorities for engineering research pursuit. Reports from recent visioning events are slated to be released to the public in the next few months.


IEEE is one of more than 20 professional engineering societies that have joined ERVA as affiliate partners.

Research energy storage and greenhouse gas capture solutions

Identifying technologies to address the climate crisis was ERVA’s first theme. It was chosen based on the results of a survey of the engineering community, conducted by ERVA last year, asking what the research priorities should be.

“The resounding answer from the 500 respondents was climate change,” says Dorota Grejner-Brzezinska, ERVA’s principal investigator. She is a vice president for knowledge enterprise at Ohio State University, in Columbus.

During the virtual visioning event in December, experts explored solar and renewable energy, carbon sequestration, water management, and geoengineering. The climate change task force released its report last month.

These are some of the research areas ERVA said should be pursued:

  • Energy storage, transmission, and critical materials. The materials include those that are nanoengineered, ones that could be used for nontraditional energy storage, and those that can extract additional energy from heat cycles.
  • Greenhouse gas capture and elimination. Research priorities included capturing and eliminating methane and nitrous oxide released in agriculture operations.
  • Resilient, energy-efficient, and healthful infrastructure. One identified priority was research to develop low-cost coatings for buildings and roads to reduce heat effects and increase self-cooling.
  • Water, ecosystem, and geoengineering assessments. The report identifies research on sensing, measurement, and AI models to analyze the flow of water, to ensure its availability during droughts and other disruptive events caused or worsened by climate change.

“The groundwork ERVA has laid out in this report creates a blueprint for funders to invest in,” Grejner-Brzezinska says, “and catalyzes engineering research for a more secure and sustainable world. As agencies and research organizations enact legislation to reduce carbon emissions and bolster clean-energy technologies, engineering is poised to lead with research and development.”

IEEE is developing a strategy to guide the organization’s response to the global threat.

Use biology and engineering to interrupt the transfer of viruses

A virtual visioning event on Leveraging Biology to Power Engineering Impact was held in March. The hope, as explained on the event’s website, is to transform research where biology and engineering intersect: health care and medicine, agriculture, and high tech.

“As agencies and research organizations enact legislation to reduce carbon emissions and bolster clean-energy technologies, engineering is poised to lead with research and development.”

The experts considered research directions in three areas: using biology to inspire engineers to develop new components; adapting and adopting biological constructs beyond their original function; and creating engineering systems and components that improve on biology. An example would be interrupting the transfer of viruses from one species to another so as to reduce the spread of diseases.

The task force’s report on which research areas to pursue is scheduled to be released next month, according to Grejner-Brzezinska.

Protect infrastructure from hackers

One of today’s main engineering challenges, according to ERVA, is the protection of infrastructure against hackers and other threats. At the in-person visioning event held last month at MIT on the Engineering R&D Solutions for Unhackable Infrastructure theme, researchers discussed gaps in security technologies and looked at how to design trustworthy systems and how to build resilience into interdependent infrastructures.

ERVA describes “unhackable” as the ability to ensure safety, security, and trust in essential systems and services that society relies on.

The task force examined research themes related to physical infrastructure such as assets and hardware; software and algorithms; and data and communication networks. It also considered new security methods for users, operators, and security administrators to thwart cyberattacks.

Grejner-Brzezinska says the task force’s report will be released in mid-December.

Sustainable transportation networks

Planning has begun for the next visioning event, Sustainable Transportation Networks, to be held virtually on 2 and 3 November. The session is to explore innovative and sustainable transportation modes and the infrastructure networks needed to support them. Some of the areas to be discussed are green construction; longitudinal impact studies; interconnected transportation modes such as rail, marine, and air transport; and transportation equity.

Become an ERVA supporter

ERVA will convene four visioning events each year on broad engineering research themes that have the potential to solve societal challenges, Grejner-Brzezinska says. IEEE members who are experts in the fields can get involved by joining the ERVA Champions, now more than 900 strong. They are among the first to learn about upcoming visioning sessions and about openings to serve on volunteer groups such as thematic task forces, advisory boards, and standing councils. Members can sign up on the ERVA website.

“Becoming a champion is an opportunity to break out of your silos of disciplines and really come together with others in the engineering research community,” Grejner-Brzezinska says. “You can do what engineers do best: solve problems.”


Match ID: 1 Score: 25.71 source: spectrum.ieee.org age: 3 days
qualifiers: 12.86 climate change, 12.86 carbon

Interview: New UN climate chief takes the fight personally
Sat, 24 Sep 2022 13:43:02 EDT
The United Nations official now in charge of the fight to curb climate change has a personal stake in the battle to reduce emissions
Match ID: 2 Score: 15.00 source: www.washingtonpost.com age: 0 days
qualifiers: 15.00 climate change

This dash for growth represents the death of green Toryism
Sat, 24 Sep 2022 16:00:08 GMT

Boris Johnson was far more eco-conscious than recent Conservative predecessors. But this mini-budget is a reversion to type

The dash for growth by Kwasi Kwarteng means unshackling City bankers and property developers from the taxes and regulations that prevent them from paving over what’s left of Britain’s green and pleasant land.

The humble concrete mixer will be elevated to exalted status. There will be more executive homes built on greenfield sites. More distribution sheds dotted along busy A-roads. And more urban renewal of the kind that involves tearing down buildings in a plume of dust and carbon emissions to replace them with something not much better, at least not in environmental terms.

Continue reading...
Match ID: 3 Score: 15.00 source: www.theguardian.com age: 0 days
qualifiers: 15.00 carbon

Philadelphia’s Diatom Archive Is a Way, Way, Wayback Machine
Sat, 24 Sep 2022 12:00:00 +0000
A cache of phytoplankton at the Academy of Natural Sciences of Drexel University is helping researchers reconstruct historical coastlines.
Match ID: 4 Score: 15.00 source: www.wired.com age: 0 days
qualifiers: 15.00 climate change

Decarbonising the energy system by 2050 could save trillions - Oxford study
2022-09-24T08:31:56+00:00

Match ID: 5 Score: 15.00 source: www.reddit.com age: 0 days
qualifiers: 15.00 carbon

Mercedes’ F1 team cut its freight emissions by 89% with biofuel switch
Fri, 23 Sep 2022 14:47:08 +0000
16 trucks used biofuels to haul between the final three European races this year.
Match ID: 6 Score: 15.00 source: arstechnica.com age: 1 day
qualifiers: 15.00 carbon

Yeti CFO Paul Carbone resigning effective Oct. 28, shares dip 3.5% premarket
Fri, 23 Sep 2022 12:06:08 GMT

Yeti Holdings Inc. said Friday that Chief Financial Officer Paul Carbone is resigning effective Oct. 28 to pursue a business opportunity that will allow him to be closer to family in Boston. The provider of outdoor products such as coolers, drinkware, and backpacks has commenced a search for a replacement. Shares were down 3.5% premarket and have fallen 65% in the year to date, while the S&P 500 has fallen 21%.

Market Pulse Stories are Rapid-fire, short news bursts on stocks and markets as they move. Visit MarketWatch.com for more information on this news.


Match ID: 7 Score: 15.00 source: www.marketwatch.com age: 1 day
qualifiers: 15.00 carbon

Mini-budget fell far short of promoting low-carbon future for UK
Fri, 23 Sep 2022 12:00:08 GMT

While not devoid of green measures, Kwarteng’s announcement was more notable for what it did not include

The chancellor, Kwasi Kwarteng, has announced that the effective ban on onshore wind farms is to be lifted, and the poorest households will regain access to insulation and energy efficiency measures.

Polls show that onshore wind is popular, with more than 70% of people supporting it. Jess Ralston, a senior analyst at the Energy and Climate Intelligence Unit, said: “The ban on onshore wind has been a major anomaly in British energy policy given it’s both cheap and popular with the public. So a decision to lift the ban suggests [Kwarteng] has listened to the experts and understands building more British renewables reduces our reliance on costly gas and so brings down bills.”

Continue reading...
Match ID: 8 Score: 15.00 source: www.theguardian.com age: 1 day
qualifiers: 15.00 carbon

Climate change risk to coastal castles - English Heritage
Fri, 23 Sep 2022 00:04:19 GMT
Rising sea levels are threatening ancient castles and forts at an accelerating rate, says English Heritage.
Match ID: 9 Score: 15.00 source: www.bbc.co.uk age: 1 day
qualifiers: 15.00 climate change

Climate change: Spike in Amazon emissions linked to law enforcement
Thu, 22 Sep 2022 23:00:23 GMT
Scientists say a huge increase in deforestation in the Amazon is linked to lax law enforcement.
Match ID: 10 Score: 15.00 source: www.bbc.co.uk age: 1 day
qualifiers: 15.00 climate change

Lawns Are Dumb. But Ripping Them Out May Come With a Catch
Thu, 22 Sep 2022 12:00:00 +0000
Meticulous turf is environmentally terrible. Yet grass does have one charm: It “sweats,” helping cool the local area.
Match ID: 11 Score: 15.00 source: www.wired.com age: 2 days
qualifiers: 15.00 climate change

Europe’s Heat Waves Offer a Grim Vision of the Future
Thu, 22 Sep 2022 11:00:00 +0000
Extreme temperatures are the direct result of climate change, which means more intense heat events, wildfires, and droughts to come.
Match ID: 12 Score: 15.00 source: www.wired.com age: 2 days
qualifiers: 15.00 climate change

UN chief: 'Tax fossil fuel profits for climate damage'
Tue, 20 Sep 2022 13:30:00 GMT
Tax fossil fuel companies' profits to pay for the damage done by climate change, says UN Secretary General.
Match ID: 13 Score: 10.71 source: www.bbc.co.uk age: 4 days
qualifiers: 10.71 climate change

We Can Now Train Big Neural Networks on Small Devices
Tue, 20 Sep 2022 13:02:00 +0000


The gadgets around us are constantly learning about our lives. Smartwatches pick up on our vital signs to track our health. Home speakers listen to our conversations to recognize our voices. Smartphones play grammarian, watching what we write in order to fix our idiosyncratic typos. We appreciate these conveniences, but the information we share with our gadgets isn’t always kept between us and our electronic minders. Machine learning can require heavy hardware, so “edge” devices like phones often send raw data to central servers, which then return trained algorithms. Some people would like that training to happen locally. A new AI training method expands the training capabilities of smaller devices, potentially helping to preserve privacy.

The most powerful machine-learning systems use neural networks, complex functions filled with tunable parameters. During training, a network receives an input (such as a set of pixels), generates an output (such as the label “cat”), compares its output with the correct answer, and adjusts its parameters to do better next time. To know how to tune each of those internal knobs, the network needs to remember the effect of each one, and they regularly number in the millions or even billions. That requires a lot of memory. Training a neural network can require hundreds of times the memory needed merely to use one (a process also called “inference”). In the latter case, the memory is allowed to forget what each layer of the network did (its “activations”) as soon as it passes information to the next layer.
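That train-and-adjust loop can be sketched in a few lines. This toy example (not from the paper) uses a one-parameter “network” so the whole process fits in view:

```python
# Toy version of the training loop described above: a one-parameter "network"
# learns the mapping 2.0 -> 6.0 by repeatedly comparing its output with the
# correct answer and nudging its parameter.
def train_step(weight, x, target, lr=0.1):
    output = weight * x        # forward pass (inference)
    error = output - target    # compare with the correct answer
    grad = error * x           # how the parameter affected the error
    return weight - lr * grad  # adjust to do better next time

weight = 0.0
for _ in range(100):
    weight = train_step(weight, x=2.0, target=6.0)
# weight converges to 3.0
```

A real network repeats this for millions or billions of parameters and must hold on to each layer's intermediate activations in order to compute the adjustments; that is where the training-time memory goes.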


To reduce the memory demanded during the training phase, researchers have employed a few tricks. In one, called paging or offloading, the machine moves activations from short-term memory to a slower but more abundant type of memory, such as flash or an SD card, then brings them back when needed. In another, called rematerialization, the machine deletes the activations, then computes them again later. Previously, memory-reduction systems used one of those two tricks or, says Shishir Patil, a computer scientist at the University of California, Berkeley, and the lead author of the paper describing the innovation, combined them using “heuristics” that are “suboptimal,” often requiring a lot of energy. The innovation reported by Patil and his collaborators formalizes the combination of paging and rematerialization.
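As a rough illustration of the two tricks, consider a per-activation cost model (the layer names and costs below are invented). A greedy heuristic just picks whichever trick is locally cheaper for each activation, ignoring global constraints such as a total memory or energy budget, which is the kind of shortcoming that makes heuristics suboptimal:

```python
# Toy per-activation cost model for the two memory-saving tricks
# (the layer names and costs are invented for illustration).
def naive_plan(activations):
    plan = []
    for act in activations:
        if act["transfer_cost"] < act["recompute_cost"]:
            plan.append((act["name"], "page"))           # cheaper to move to flash
        else:
            plan.append((act["name"], "rematerialize"))  # cheaper to recompute
    return plan

acts = [
    {"name": "conv1", "transfer_cost": 5.0, "recompute_cost": 2.0},
    {"name": "conv2", "transfer_cost": 3.0, "recompute_cost": 9.0},
]
print(naive_plan(acts))  # [('conv1', 'rematerialize'), ('conv2', 'page')]
```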

“Taking these two techniques, combining them well into this optimization problem, and then solving it—that’s really nice,” says Jiasi Chen, a computer scientist at the University of California, Riverside, who works on edge computing but was not involved in the work.

In July, Patil presented his system, called POET (private optimal energy training), at the International Conference on Machine Learning, in Baltimore. He first gives POET a device’s technical details and information about the architecture of a neural network he wants it to train. He specifies a memory budget and a time budget. He then asks it to create a training process that minimizes energy usage. The process might decide to page certain activations that would be inefficient to recompute but rematerialize others that are simple to redo but require a lot of memory to store.

One of the keys to the breakthrough was to define the problem as a mixed integer linear programming (MILP) puzzle, a set of constraints and relationships between variables. For each device and network architecture, POET plugs its variables into Patil’s hand-crafted MILP program, then finds the optimal solution. “A main challenge is actually formulating that problem in a nice way so that you can input it into a solver,” Chen says. “So, you capture all of the realistic system dynamics, like energy, latency, and memory.”
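A brute-force stand-in conveys the shape of that optimization problem (POET uses a real MILP solver; the three-way choice, costs, and memory budget below are invented): minimize total energy subject to a memory budget.

```python
from itertools import product

# Brute-force stand-in for the optimization POET formulates as an MILP.
# Each activation is kept in RAM, paged out, or rematerialized; we search
# for the lowest-energy plan that fits the memory budget.
def best_plan(acts, mem_budget):
    best = None
    for choices in product(("keep", "page", "rematerialize"), repeat=len(acts)):
        mem = sum(a["mem"] for a, c in zip(acts, choices) if c == "keep")
        if mem > mem_budget:
            continue  # plan does not fit in RAM
        energy = sum(a["energy"][c] for a, c in zip(acts, choices))
        if best is None or energy < best[0]:
            best = (energy, choices)
    return best

acts = [
    {"mem": 4, "energy": {"keep": 0, "page": 2, "rematerialize": 1}},  # cheap to redo
    {"mem": 4, "energy": {"keep": 0, "page": 3, "rematerialize": 5}},  # costly either way
]
print(best_plan(acts, mem_budget=4))  # (1, ('rematerialize', 'keep'))
```

A real solver handles millions of such decisions, plus latency and energy constraints, without enumerating every combination.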

The team tested POET on four different processors, whose RAM ranged from 32 KB to 8 GB. On each, the researchers trained three different neural network architectures: two types popular in image recognition (VGG16 and ResNet-18), plus a popular language-processing network (BERT). In many of the tests, the system could reduce memory usage by about 80 percent, without a big bump in energy use. Comparable methods couldn’t do both at the same time. According to Patil, the study showed that BERT can now be trained on the smallest devices, which was previously impossible.

“When we started off, POET was mostly a cute idea,” Patil says. Now, several companies have reached out about using it, and at least one large company has tried it in its smart speaker. One thing they like, Patil says, is that POET doesn’t reduce network precision by “quantizing,” or abbreviating, activations to save memory. So the teams that design networks don’t have to coordinate with teams that implement them in order to negotiate trade-offs between precision and memory.

Patil notes other reasons to use POET besides privacy concerns. Some devices need to train networks locally because they have low or no Internet connection. These include devices used on farms, in submarines, or in space. Other setups can benefit from the innovation because data transmission requires too much energy. POET could also make large devices—Internet servers—more memory efficient and energy efficient. But as for keeping data private, Patil says, “I guess this is very timely, right?”


Match ID: 14 Score: 10.71 source: spectrum.ieee.org age: 4 days
qualifiers: 10.71 carbon

Satellite Imagery for Everyone
Sat, 19 Feb 2022 16:00:00 +0000


Every day, satellites circling overhead capture trillions of pixels of high-resolution imagery of the surface below. In the past, this kind of information was mostly reserved for specialists in government or the military. But these days, almost anyone can use it.

That’s because the cost of sending payloads, including imaging satellites, into orbit has dropped drastically. High-resolution satellite images, which used to cost tens of thousands of dollars, now can be had for the price of a cup of coffee.

What’s more, with the recent advances in artificial intelligence, companies can more easily extract the information they need from huge digital data sets, including ones composed of satellite images. Using such images to make business decisions on the fly might seem like science fiction, but it is already happening within some industries.



These underwater sand dunes adorn the seafloor between Andros Island and the Exuma islands in the Bahamas. The turquoise to the right reflects a shallow carbonate bank, while the dark blue to the left marks the edge of a local deep called Tongue of the Ocean. This image was captured in April 2020 using the Moderate Resolution Imaging Spectroradiometer on NASA’s Terra satellite.

Joshua Stevens/NASA Earth Observatory


Here’s a brief overview of how you, too, can access this kind of information and use it to your advantage. But before you’ll be able to do that effectively, you need to learn a little about how modern satellite imagery works.

The orbits of Earth-observation satellites generally fall into one of two categories: GEO and LEO. The former is shorthand for geosynchronous equatorial orbit. GEO satellites are positioned roughly 36,000 kilometers above the equator, where they circle in sync with Earth’s rotation. Viewed from the ground, these satellites appear to be stationary, in the sense that their bearing and elevation remain constant. That’s why GEO is said to be a geostationary orbit.
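The 36,000 km figure falls straight out of Kepler's third law: it is the altitude at which an orbital period matches one sidereal day. A quick sanity check:

```python
import math

# Sanity check of the ~36,000 km GEO figure: Kepler's third law gives the
# orbital radius whose period equals Earth's sidereal day.
GM = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
T = 86164.1              # sidereal day, s
EARTH_RADIUS = 6378.1e3  # equatorial radius, m

orbit_radius = (GM * T**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = (orbit_radius - EARTH_RADIUS) / 1000
print(round(altitude_km))  # 35786
```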

Such orbits are, of course, great for communications relays—it’s what allows people to mount satellite-TV dishes on their houses in a fixed orientation. But GEO satellites are also appropriate when you want to monitor some region of Earth by capturing images over time. Because the satellites are so high up, the resolution of that imagery is quite coarse, however. So these orbits are primarily used for observation satellites designed to track changing weather conditions over broad areas.

Being stationary with respect to Earth means that GEO satellites are always within range of a downlink station, so they can send data back to Earth in minutes. This allows them to alert people to changes in weather patterns almost in real time. Most of this kind of data is made available for free by the U.S. National Oceanic and Atmospheric Administration.



In March 2021, the container ship Ever Given ran aground, blocking the Suez Canal for six days. This satellite image of the scene, obtained using synthetic-aperture radar, shows the kind of resolution that is possible with this technology.

Capella Space


The other option is LEO, which stands for low Earth orbit. Satellites placed in LEO are much closer to the ground, which allows them to obtain higher-resolution images. And the lower you can go, the better the resolution you can get. The company Planet, for example, increased the resolution of its recently completed satellite constellation, SkySat, from 72 centimeters per pixel to just 50 cm—an incredible feat—by lowering the orbits its satellites follow from 500 to 450 km and improving the image processing.

The best commercially available spatial resolution for optical imagery is 25 cm, which means that one pixel represents a 25-by-25-cm area on the ground—roughly the size of your laptop. A handful of companies capture data with 25-cm to 1-meter resolution, which is considered high to very high resolution in this industry. Some of these companies also offer data from 1- to 5-meter resolution, considered medium to high resolution. Finally, several government programs have made optical data available at 10-, 15-, 30-, and 250-meter resolutions for free with open data programs. These include NASA/U.S. Geological Survey Landsat, NASA MODIS (Moderate Resolution Imaging Spectroradiometer), and ESA Copernicus. This imagery is considered low resolution.
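The laptop-sized-pixel comparison is easy to verify, and the same arithmetic gives a feel for how much ground a full frame covers (the 20,000-by-20,000-pixel frame below is an arbitrary example size, not a specific sensor's format):

```python
# Footprint arithmetic for the resolution tiers above: ground area per pixel,
# and per frame, at a given ground sample distance.
def pixel_area_m2(resolution_m):
    return resolution_m ** 2

def frame_footprint_km2(resolution_m, width_px, height_px):
    return pixel_area_m2(resolution_m) * width_px * height_px / 1e6

print(pixel_area_m2(0.25))                      # 0.0625 m^2, roughly a laptop
print(frame_footprint_km2(0.25, 20000, 20000))  # 25.0 km^2
```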

Because the satellites that provide the highest-resolution images are in the lowest orbits, they sense less area at once. To cover the entire planet, a satellite can be placed in a polar orbit, which takes it from pole to pole. As it travels, Earth rotates under it, so on its next pass, it will be above a different part of Earth.

Many of these satellites don’t pass directly over the poles, though. Instead, they are placed in a near-polar orbit that has been specially designed to take advantage of a subtle bit of physics. You see, the spinning Earth bulges outward slightly at the equator. That extra mass causes the orbits of satellites that are not in polar orbits to shift or (technically speaking) to precess. Satellite operators often take advantage of this phenomenon to put a satellite in what’s called a sun-synchronous orbit. Such orbits allow the repeated passes of the satellite over a given spot to take place at the same time of day. Not having the pattern of shadows shift between passes helps the people using these images to detect changes.
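The precession rate a sun-synchronous orbit needs is fixed by geometry: the orbital plane must rotate a full 360 degrees per year to keep pace with Earth's motion around the sun, or roughly one degree per day.

```python
# Required plane-precession rate for a sun-synchronous orbit: one full
# rotation per year to keep pace with the sun.
days_per_year = 365.2422
required_precession_deg_per_day = 360 / days_per_year
print(round(required_precession_deg_per_day, 3))  # 0.986
```

Mission designers then choose the altitude and inclination at which the equatorial bulge produces exactly this precession rate.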




It usually takes 24 hours for a satellite in polar orbit to survey the entire surface of Earth. To image the whole world more frequently, satellite companies use multiple satellites, all equipped with the same sensor and following different orbits. In this way, these companies can provide more frequently updated images of a given location. For example, Maxar’s Worldview Legion constellation, launching later this year, includes six satellites.

After a satellite captures some number of images, all that data needs to be sent down to Earth and processed. The time required for that varies.

DigitalGlobe (which Maxar acquired in 2017) recently announced that it had managed to send data from a satellite down to a ground station and then store it in the cloud in less than a minute. That was possible because the image sent back was of the parking lot of the ground station, so the satellite didn’t have to travel between the collection point and where it had to be to do the data “dumping,” as this process is called.

In general, Earth-observation satellites in LEO don’t capture imagery all the time—they do that only when they are above an area of special interest. That’s because these satellites are limited to how much data they can send at one time. Typically, they can transmit data for only 10 minutes or so before they get out of range of a ground station. And they cannot record more data than they’ll have time to dump.
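The record-versus-dump constraint is simple arithmetic (the 1.2 Gbps link rate below is an assumed figure for illustration, not a number from a specific satellite):

```python
# Record-versus-dump arithmetic: a satellite cannot usefully record more per
# orbit than one ground-station pass can carry.
def max_recordable_gb(link_rate_mbps, contact_seconds):
    # megabits transferred, converted to gigabytes (8 bits/byte, 1000 MB/GB)
    return link_rate_mbps * contact_seconds / 8 / 1000

print(max_recordable_gb(1200, 600))  # 90.0 GB per 10-minute pass
```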

Currently, ground stations are located mostly near the poles, the areas that polar-orbiting satellites pass over most often. But we can soon expect distances to the nearest ground station to shorten, because both Amazon and Microsoft have announced intentions to build large networks of ground stations located all over the world. As it turns out, hosting the terabytes of satellite data that are collected daily is big business for these companies, which sell their cloud services (Amazon Web Services and Microsoft’s Azure) to satellite operators.

For now, if you are looking for imagery of an area far from a ground station, expect a significant delay—maybe hours—between capture and transmission of the data. The data will then have to be processed, which adds yet more time. The fastest providers currently make their data available within 48 hours of capture, but not all can manage that. While it is possible, under ideal weather conditions, for a commercial entity to request a new capture and get the data it needs delivered the same week, such quick turnaround times are still considered cutting edge.


The best commercially available spatial resolution is 25 centimeters for optical imagery, which means that one pixel represents something roughly the size of your laptop.


I’ve been using the word “imagery,” but it’s important to note that satellites do not capture images the same way ordinary cameras do. The optical sensors in satellites are calibrated to measure reflectance over specific bands of the electromagnetic spectrum. This could mean they record how much red, green, and blue light is reflected from different parts of the ground. The satellite operator will then apply a variety of adjustments to correct colors, combine adjacent images, and account for parallax, forming what’s called a true-color composite image, which looks pretty much like what you would expect to get from a good camera floating high in the sky and pointed directly down.
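The band-stacking step can be sketched in a few lines (a toy one-row scene with made-up reflectance values; a real pipeline also applies the color and parallax corrections just described):

```python
# Toy band-stacking step for a true-color composite: scale each band's
# reflectance (0-1) to 8-bit values and zip the bands into (R, G, B) pixels.
def to_8bit(reflectance, max_reflectance=1.0):
    return round(255 * min(reflectance, max_reflectance) / max_reflectance)

def composite(red_band, green_band, blue_band):
    return [tuple(to_8bit(v) for v in pixel)
            for pixel in zip(red_band, green_band, blue_band)]

red   = [0.05, 0.30]  # one row, two pixels: vegetation-ish, then bare soil
green = [0.12, 0.25]
blue  = [0.04, 0.20]
print(composite(red, green, blue))  # [(13, 31, 10), (76, 64, 51)]
```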

Imaging satellites can also capture data outside of the visible-light spectrum. The near-infrared band is widely used in agriculture, for example, because these images help farmers gauge the health of their crops. This band can also be used to detect soil moisture and a variety of other ground features that would otherwise be hard to determine.
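The usual crop-health calculation built on the near-infrared band is the normalized difference vegetation index (NDVI). Healthy vegetation reflects strongly in NIR and absorbs red light, so NDVI approaches 1 over dense crops and drops toward 0 over bare or stressed ground (the reflectance values below are illustrative, not measured):

```python
# Normalized difference vegetation index, the standard use of the
# near-infrared band for gauging crop health.
def ndvi(nir, red):
    return (nir - red) / (nir + red)

print(round(ndvi(nir=0.60, red=0.08), 2))  # 0.76, healthy vegetation
print(round(ndvi(nir=0.25, red=0.20), 2))  # 0.11, sparse or bare ground
```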

Longer-wavelength “thermal” IR does a good job of penetrating smoke and picking up heat sources, making it useful for wildfire monitoring. And synthetic-aperture radar satellites, which I discuss in greater detail below, are becoming more common because the images they produce aren’t affected by clouds and don’t require the sun for illumination.

You might wonder whether aerial imagery, say, from a drone, wouldn’t work at least as well as satellite data. Sometimes it can. But for many situations, using satellites is the better strategy. Satellites can capture imagery over areas that would be difficult to access otherwise because of their remoteness, for example. Or there could be other sorts of accessibility issues: The area of interest could be in a conflict zone, on private land, or in another place that planes or drones cannot overfly.

So with satellites, organizations can easily monitor the changes taking place at various far-flung locations. Satellite imagery allows pipeline operators, for instance, to quickly identify incursions into their right-of-way zones. The company can then take steps to prevent a disastrous incident, such as someone puncturing a gas pipeline while construction is taking place nearby.



This SkySat image shows the effect of a devastating landslide that took place on 30 December 2020. Debris from that landslide destroyed buildings and killed 10 people in the Norwegian village of Ask.

SkySat/Planet



The ability to compare archived imagery with recently acquired data has helped a variety of industries. For example, insurance companies sometimes use satellite data to detect fraudulent claims (“Looks like your house had a damaged roof when you bought it…”). And financial-investment firms use satellite imagery to evaluate such things as retailers’ future profits based on parking-lot fullness or to predict crop prices before farmers report their yields for the season.

Satellite imagery provides a particularly useful way to find or monitor the location of undisclosed features or activities. Sarah Parcak of the University of Alabama, for example, uses satellite imagery to locate archaeological sites of interest. 52Impact, a consulting company in the Netherlands, identified undisclosed waste dump sites by training an algorithm to recognize their telltale spectral signature. Satellite imagery has also helped identify illegal fishing activities, fight human trafficking, monitor oil spills, get accurate reporting on COVID-19 deaths, and even investigate Uyghur internment camps in China—all situations where the primary actors couldn’t be trusted to accurately report what’s going on.

Despite these many successes, investigative reporters and nongovernmental organizations aren’t yet using satellite data regularly, perhaps because even the small cost of the imagery is a deterrent. Thankfully, some kinds of low-resolution satellite data can be had for free.

The first place to look for free satellite imagery is the Copernicus Open Access Hub and EarthExplorer. Both offer free access to a wide range of open data. The imagery is lower resolution than what you can purchase, but if the limited resolution meets your needs, why spend money?

If you require medium- or high-resolution data, you might be able to buy it directly from the relevant satellite operator. This field recently went through a period of mergers and acquisitions, leaving only a handful of providers, the big three in the West being Maxar and Planet in the United States and Airbus in Germany. There are also a few large Asian providers, such as SI Imaging Services in South Korea and Twenty First Century Aerospace Technology in Singapore. Most providers have a commercial branch, but they primarily target government buyers. And they often require large minimum purchases, which is unhelpful to companies looking to monitor hundreds of locations or fewer.

Expect the distance to the nearest ground station to shorten because both Amazon and Microsoft have announced intentions to build large networks of ground stations located all over the world.

Fortunately, approaching a satellite operator isn’t the only option. In the past five years, a cottage industry of consultants and local resellers with exclusive deals to service a certain market has sprung up. Aggregators and resellers spend years negotiating contracts with multiple providers so they can offer customers access to data sets at more attractive prices, sometimes for as little as a few dollars per image. Some companies providing geographic information systems—including Esri, L3Harris, and Safe Software—have also negotiated reselling agreements with satellite-image providers.

Traditional resellers are middlemen who will connect you with a salesperson to discuss your needs, obtain quotes from providers on your behalf, and negotiate pricing and priority schedules for image capture and sometimes also for the processing of the data. This is the case for Apollo Mapping, European Space Imaging, Geocento, LandInfo, Satellite Imaging Corp., and many more. The more innovative resellers will give you access to digital platforms where you can check whether an image you need is available from a certain archive and then order it. Examples include LandViewer from EOS and Image Hunter from Apollo Mapping.

More recently, a new crop of aggregators began offering customers the ability to programmatically access Earth-observation data sets. These companies work best for people looking to integrate such data into their own applications or workflows. These include the company I work for, SkyWatch, which provides such a service, called EarthCache. Other examples are UP42 from Airbus and Sentinel Hub from Sinergise.

While you will still need to talk with a sales rep to activate your account—most often to verify that you will use the data in ways that fit the company’s terms of service and licensing agreements—once you’ve been granted access to their applications, you will be able to programmatically order archive data from one or multiple providers. SkyWatch is, however, the only aggregator allowing users to programmatically request future data to be collected (“tasking a satellite”).

While satellite imagery is fantastically abundant and easy to access today, two changes are afoot that will expand further what you can do with satellite data: faster revisits and greater use of synthetic-aperture radar (SAR).

Satellite images have helped to reveal China’s treatment of its Muslim Uyghur minority. About a million Uyghurs (and other ethnic minorities) have been interned in prisons or camps like the one shown here [top], a sprawling compound of dozens of large buildings in a desert area to the east of the city of Ürümqi, the capital of China’s Xinjiang Uyghur Autonomous Region. Another satellite image [bottom] shows the characteristic oval shape of a fixed-chimney Bull’s trench kiln, a racetrack-shaped structure with a tall chimney in the middle, a type widely used for manufacturing bricks in southern Asia; this one is located in Pakistan’s Punjab province. The design poses environmental concerns because of the sooty air pollution it generates, and such kilns have also been associated with human-rights abuses. Top: CNES/Airbus/Google Earth; Bottom: Maxar Technologies/Google Earth

The first of these developments is not surprising. As more Earth-observation satellites are put into orbit, more images will be taken, more often. So how frequently a given area is imaged by a satellite will increase. Right now, that’s typically two or three times a week. Expect the revisit rate soon to become several times a day. This won’t entirely address the challenge of clouds obscuring what you want to view, but it will help.

The second development is more subtle. Data from the two satellites of the European Space Agency’s Sentinel-1 SAR mission, available at no cost, has enabled companies to dabble in SAR over the last few years.

With SAR, the satellite beams radio waves down and measures the return signals bouncing off the surface. It does that continually, and clever processing is used to turn that data into images. The use of radio allows these satellites to see through clouds and to collect measurements day and night. Depending on the radar band that’s employed, SAR imagery can be used to judge material properties, moisture content, precise movements, and elevation.
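A rough sense of what that processing buys: a radar's slant-range resolution improves with transmitted bandwidth, following the standard relation ΔR = c/(2B). The bandwidth values below are illustrative and not tied to any particular satellite.

```python
# Back-of-the-envelope: a radar's slant-range resolution improves with
# transmitted bandwidth, following delta_R = c / (2 * B).
C = 299_792_458.0  # speed of light, m/s

def range_resolution_m(bandwidth_hz: float) -> float:
    return C / (2.0 * bandwidth_hz)

# Illustrative bandwidths (not specific to any commercial SAR satellite):
for b in (30e6, 300e6, 1.2e9):
    print(f"{b/1e6:7.0f} MHz bandwidth -> {range_resolution_m(b):.2f} m resolution")
```

The same trade-off is why higher-resolution SAR products generally come from wider-bandwidth (and more power-hungry) radar payloads.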

As more companies become familiar with such data sets, demand for satellite SAR imagery will no doubt grow. SAR has been widely used by the military since the 1970s but is just now starting to appear in commercial products. You can expect those offerings to grow dramatically.

Indeed, a large portion of the money being invested in this industry is currently going to fund large SAR constellations, including those of Capella Space, Iceye, Synspective, XpressSAR, and others. The market is going to get crowded fast, which is great news for customers. It means they will be able to obtain high-resolution SAR images of the place they’re interested in, taken every hour (or less), day or night, cloudy or clear.

People will no doubt figure out wonderful new ways to employ this information, so the more folks who have access to it, the better. This is something my colleagues at SkyWatch and I deeply believe, and it’s why we’ve made it our mission to help democratize access to satellite imagery.

One day in the not-so-distant future, Earth-observation satellite data might become as ubiquitous as GPS, another satellite technology first used only by the military. Imagine, for example, being able to take out your phone and say something like, “Show me this morning’s soil-moisture map for Grover’s Corners High; I want to see whether the baseball fields are still soggy.”

This article appears in the March 2022 print issue as “A Boom with a View.”

Editor's note: The original version of this article incorrectly stated that Maxar's Worldview Legion constellation launched last year.


Match ID: 15 Score: 7.86 source: spectrum.ieee.org age: 217 days
qualifiers: 5.71 air pollution, 2.14 carbon

Solar-to-Jet-Fuel System Readies for Takeoff
Wed, 03 Aug 2022 17:00:00 +0000


As climate change edges from crisis to emergency, the aviation sector looks set to miss its 2050 goal of net-zero emissions. In the five years preceding the pandemic, the top four U.S. airlines—American, Delta, Southwest, and United—saw a 15 percent increase in the use of jet fuel. Despite continual improvements in engine efficiencies, that number is projected to keep rising.

A glimmer of hope, however, comes from solar fuels. For the first time, scientists and engineers at the Swiss Federal Institute of Technology (ETH) in Zurich have reported a successful demonstration of an integrated fuel-production plant for solar kerosene. Using concentrated solar energy, they were able to produce kerosene from water and carbon dioxide captured directly from air. Fuel thus produced is a drop-in alternative to fossil-derived fuels and can be used with existing storage and distribution infrastructure and existing engines.

Fuels derived from synthesis gas (or syngas), an intermediate product that is a specific mixture of carbon monoxide and hydrogen, are a known alternative to conventional, fossil-derived fuels. Liquid hydrocarbons are produced from syngas by Fischer-Tropsch (FT) synthesis, in which chemical reactions convert the carbon monoxide and hydrogen into hydrocarbons. The team of researchers at ETH found that a solar-driven thermochemical method to split water and carbon dioxide using a metal oxide redox cycle can produce renewable syngas. They demonstrated the process in a rooftop solar refinery at the ETH Machine Laboratory in 2019.

Reticulated porous structure made of ceria used in the solar reactor to thermochemically split CO2 and H2O and produce syngas, a specific mixture of H2 and CO. ETH Zurich

The current pilot-scale solar tower plant was set up at the IMDEA Energy Institute in Spain. It scales up the solar reactor of the 2019 experiment by a factor of 10, says Aldo Steinfeld, an engineering professor at ETH who led the study. The fuel plant brings together three subsystems—the solar tower concentrating facility, solar reactor, and gas-to-liquid unit.

First, a heliostat field made of mirrors that rotate to follow the sun concentrates solar irradiation onto a reactor mounted on top of the tower. The reactor is a cavity receiver lined with reticulated porous ceramic structures made of ceria (or cerium(IV) oxide). Within the reactor, the concentrated sunlight creates a high-temperature environment of about 1,500 °C, hot enough to split carbon dioxide and water captured from the atmosphere and produce syngas. Finally, the syngas is processed into kerosene in the gas-to-liquid unit. A centralized control room operates the whole system.

Fuel produced using this method closes the fuel carbon cycle, as it releases only as much carbon dioxide as went into its manufacture. “The present pilot fuel plant is still a demonstration facility for research purposes,” says Steinfeld, “but it is a fully integrated plant and uses a solar-tower configuration at a scale that is relevant for industrial implementation.”

“The solar reactor produced syngas with selectivity, purity, and quality suitable for FT synthesis,” the authors noted in their paper. They also reported good material stability for multiple consecutive cycles. They observed a value of 4.1 percent solar-to-syngas energy efficiency, which Steinfeld says is a record value for thermochemical fuel production, even though better efficiencies are required to make the technology economically competitive.
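The 4.1 percent figure is just an energy ratio: the calorific value of the syngas produced divided by the solar energy input. A minimal sketch, using illustrative run values rather than the team's measurements:

```python
# Sketch of the solar-to-syngas efficiency metric: calorific value of the
# syngas produced divided by the solar radiative energy input. The run
# values below are illustrative placeholders, not ETH's measurements.
def solar_to_syngas_efficiency(syngas_energy_kwh: float,
                               solar_energy_kwh: float) -> float:
    return syngas_energy_kwh / solar_energy_kwh

# e.g. 4.1 kWh of syngas heating value per 100 kWh of concentrated sunlight:
eta = solar_to_syngas_efficiency(syngas_energy_kwh=4.1, solar_energy_kwh=100.0)
print(f"efficiency: {eta:.1%}")                       # 4.1%
print(f"gap to the 20 percent target: {0.20/eta:.1f}x")  # 4.9x
```

The same ratio explains why recovering the rejected reactor heat matters so much: every kilowatt-hour recycled shrinks the denominator.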

Schematic of the solar tower fuel plant. A heliostat field concentrates solar radiation onto a solar reactor mounted on top of the solar tower. The solar reactor cosplits water and carbon dioxide and produces a mixture of molecular hydrogen and carbon monoxide, which in turn is processed to drop-in fuels such as kerosene. ETH Zurich

“The measured value of energy conversion efficiency was obtained without any implementation of heat recovery,” he says. The heat rejected during the redox cycle of the reactor accounted for more than 50 percent of the solar-energy input. “This fraction can be partially recovered via thermocline heat storage. Thermodynamic analyses indicate that sensible heat recovery could potentially boost the energy efficiency to values exceeding 20 percent.”

To do so, more work is needed to optimize the ceramic structures lining the reactor, something the ETH team is actively working on, by looking at 3D-printed structures for improved volumetric radiative absorption. “In addition, alternative material compositions, that is, perovskites or aluminates, may yield improved redox capacity, and consequently higher specific fuel output per mass of redox material,” Steinfeld adds.

The next challenge for the researchers, he says, is the scale-up of their technology for higher solar-radiative power inputs, possibly using an array of solar cavity-receiver modules on top of the solar tower.

To bring solar kerosene into the market, Steinfeld envisages a quota-based system. “Airlines and airports would be required to have a minimum share of sustainable aviation fuels in the total volume of jet fuel that they put in their aircraft,” he says. This is possible because solar kerosene can be mixed with fossil-based kerosene. The quota would start out small, as little as 1 or 2 percent, which would raise total fuel costs at first, though minimally—adding “only a few euros to the cost of a typical flight,” as Steinfeld puts it.

Meanwhile, rising quotas would lead to investment, and to falling costs, eventually replacing fossil-derived kerosene with solar kerosene. “By the time solar jet fuel reaches 10 to 15 percent of the total jet-fuel volume, we ought to see the costs for solar kerosene nearing those of fossil-derived kerosene,” he adds.
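The economics of such a quota reduce to a weighted average of the two fuel prices. A minimal sketch with made-up placeholder prices (not figures from the article):

```python
# Illustrative blend-cost arithmetic for a fuel quota: the blended price is
# a weighted average of solar and fossil kerosene prices. The prices below
# are made-up placeholders, not figures from the article.
def blend_price(quota: float, p_solar: float, p_fossil: float) -> float:
    return quota * p_solar + (1.0 - quota) * p_fossil

p_fossil = 0.8  # EUR per litre (placeholder)
p_solar = 4.0   # EUR per litre (placeholder)

for quota in (0.01, 0.02, 0.10):
    extra = blend_price(quota, p_solar, p_fossil) - p_fossil
    print(f"{quota:.0%} quota -> +{extra:.3f} EUR/litre over pure fossil kerosene")
```

With these placeholder numbers, a 1 to 2 percent quota adds only a few cents per litre, which is consistent with the "few euros per flight" framing; the premium grows linearly with the quota unless the solar price falls.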

However, we may not have to wait too long for flights to operate solely on solar fuel. A commercial spin-off of Steinfeld’s laboratory, Synhelion, is working on commissioning the first industrial-scale solar fuel plant in 2023. The company has also collaborated with the airline SWISS to conduct a flight solely using its solar kerosene.


Match ID: 16 Score: 4.29 source: spectrum.ieee.org age: 52 days
qualifiers: 2.14 climate change, 2.14 carbon

Climate change: Pakistan floods 'likely' made worse by warming
Thu, 15 Sep 2022 22:41:45 GMT
Emissions from human activities played a role in the recent floods that have brought devastation to Pakistan.
Match ID: 17 Score: 2.14 source: www.bbc.co.uk age: 8 days
qualifiers: 2.14 climate change

Ensuring Underwater Robots Survive in Hot Tropical Waters
Thu, 15 Sep 2022 15:00:00 +0000


The hot, humid environment of tropical marine areas such as Australia’s Great Barrier Reef can wreak havoc on marine autonomous systems (MAS). Underwater and surface MAS are used for marine monitoring, locating objects such as mines on the seafloor, and rescuing swimmers.

“Tropical conditions can cause systems to overheat or prevent high-density lithium batteries from recharging,” says Melanie Olsen, who is a project director of the Australian Institute of Marine Science’s (AIMS) ReefWorks, a technology testing and evaluation facility in northern Australia. “And the microbial and small creatures that thrive in these tropical environments grow rapidly on underwater surfaces and degrade the sensor performance and the hydrodynamics of the robotics and autonomous systems.”


Developing technology that can stand up to these conditions is part of Olsen’s job, as is supporting ReefWorks’ broader mission of helping others move their autonomous systems out of the lab. It’s essential to test these systems and collect compliance evidence to demonstrate they meet regulatory requirements and can be certified for operations, says Olsen, an IEEE senior member. But there are very few places to test marine robotics, autonomous systems, and artificial-intelligence (RAS-AI) technologies, which hampers the growth of the industry, Olsen says. “It’s difficult for RAS-AI vendors to progress from a prototype to a commercial product because the pathway to a certified system is complex.”

That’s why AIMS established ReefWorks. The facility is used to test crewed and uncrewed tropical and marine vessels as well as robots, sensors, and other innovations. “We are Australia’s—and possibly the world’s—first such testing facility in the tropics,” Olsen says. Examples of underwater and surface MAS include the ReefScan CoralAUV, which is used for marine monitoring, and the Wave Adaptive Modular Vessel, a surface vessel used for marine monitoring, locating mines and other objects on the seafloor, and rescuing swimmers.

AIMS has been testing equipment for over a decade, but this part of AIMS’s facilities opened to the public in December 2021. ReefWorks supports the entire development cycle, from digital-model validation and developmental testing to product and operational-level testing, Olsen says. Physical tests can be done at AIMS’s three marine field ranges, which offer different testing conditions. ReefWorks also has land-based facilities, plus the National Sea Simulator sensor test tank, and drone corridors between the at-sea ranges for verifying the performance of long-range marine autonomous systems.

“Our overall objective is to establish a sustainable marine autonomous systems [MAS] sector in Australia,” she says.

One of the ways ReefWorks helps its users make the most of their time on test ranges is to offer “digital twins” and virtual worlds. A digital twin is a virtual model of a real-world object, machine, or system that can be used to assess how the real-world counterpart is performing.

“Each of our test ranges is developing a digital twin,” Olsen says. “Developers will be able to conduct a test mission on the virtual range so when they get here, they can replay missions with real-time collected data, and validate their MAS digital-model performance.”

Olsen leads a team of five people and is currently recruiting another five. She expects the staff to triple in size in a few years as ReefWorks becomes more established in the region.

An IEEE senior member, Olsen is active with the IEEE Northern Australia Section. She served as the section chair in 2020 and 2021, during which time the section achieved the Region 10 Outstanding Small Section Award.

Integrating embedded AI and IoT edge computing

Before joining AIMS, Olsen spent a decade in Australia’s Department of Defence (DOD) as a lead engineer working on future technologies and maritime electronic-warfare systems.

Olsen grew up in a farming family and wasn’t really exposed to computers or engineers until an EE lecturer from James Cook University, in Australia, came to her rural high school to give a presentation. He brought along a remote-controlled quadrotor helicopter—a decade before quadcopters were commonplace.

The lecture led Olsen to pursue a bachelor’s degree in electrical, electronics, and computer systems, also from James Cook University, in Townsville. She went on to earn a master’s degree in systems engineering from Australia’s University of New South Wales, in Canberra. In 2016, Olsen took a job at AIMS as an engineering team leader in technology development.

“I’m very passionate about new technologies and seeing them integrated in the field,” she says. “During my decade at the [Australian] DOD, I grew my skills in systems engineering to solve more complex technology-integration challenges. AIMS offered me an opportunity to apply these skills to the challenges facing the tropical marine environment.”


There are many similarities between what Olsen had been doing at DOD and her role at ReefWorks. “My work at both DOD and AIMS requires an understanding of how electronic subsystems work, determining what’s viable for the use case, understanding the importance of modeling and simulation, and being able to communicate engineering terminology to an interdisciplinary team,” she says. “Both roles are all about engineering problem-solving.”

Olsen is currently working on integrating embedded AI and Internet of Things edge computing into AIMS infrastructure. “Artificial intelligence is used to increase a marine autonomous system’s capabilities,” she says. “For example, AI is used to train a MAS to navigate and avoid colliding with coral reefs, other vessels, or other objects or to allow the MAS to identify specific marine species, reef areas suitable for reseeding, and marine mines.”

IoT edge computing is used to process data closer to its point of origin. “This has the potential to speed up the decision process for vessels and operators while minimizing the communications and data bandwidth needed, which are key limitations when operating in marine northern Australia,” Olsen says.

Since GPS doesn’t work underwater, another of her team’s projects is looking for additional ways to conduct accurate geospatial positioning and control for missions that don’t require marine autonomous systems to come to the surface.

“We’re only just starting to get a feel for what marine autonomous systems can do—not just for our tropical marine waters but in general,” she says. “There are grand challenges no one can solve right now, like dealing with ocean pollution and the impacts of climate change.”

Robotics engineers needed

There are nowhere near enough robotics engineers in the world, Olsen says. She recommends that engineering students take courses that include group projects.

“Group projects help you grow your ability to solve problems outside your knowledge or expertise,” she says. “They teach you how to work as an interdisciplinary team, who to ask for help, and where to find it.”

This article appears in the October 2022 print issue as “Melanie Olsen.”


Match ID: 18 Score: 2.14 source: spectrum.ieee.org age: 9 days
qualifiers: 2.14 climate change

MOXIE Shows How to Make Oxygen on Mars
Thu, 08 Sep 2022 15:27:59 +0000


Planning for the return journey is an integral part of the preparations for a crewed Mars mission. Astronauts will require a total mass of about 50 tonnes of rocket propellant for the ascent vehicle that will lift them off the planet’s surface, including approximately 31 tonnes of oxygen. The less popular option is for crewed missions to carry the required oxygen themselves. But scientists are optimistic that it could instead be produced from the carbon dioxide–rich Martian atmosphere itself, using a system called MOXIE.

The Mars Oxygen ISRU (In-Situ Resource Utilization) Experiment is an 18-kilogram unit housed within the Perseverance rover on Mars. The unit is “the size of a toaster,” says Jeffrey Hoffman, professor of aerospace engineering at MIT. Its job is to electrochemically break down carbon dioxide collected from the Martian atmosphere into oxygen and carbon monoxide. It also tests the purity of the oxygen produced.

Between February 2021, when it arrived on Mars aboard the Perseverance, and the end of that year, MOXIE had several successful test runs. According to a review of the system by Hoffman and colleagues, published in Science Advances, it has demonstrated its ability to produce oxygen during both night and day, when temperatures can vary by over 100 ºC. The generation rate and purity of the oxygen also meet the requirements for producing rocket propellant and for breathing. The authors assert that a scaled-up version of MOXIE could produce the oxygen required for liftoff as well as for the astronauts to breathe.


MOXIE is a first step toward a much larger and more complex system to support the human exploration of Mars. The researchers estimate a required generation rate of 2 to 3 kilograms per hour, compared with the current MOXIE rate of 6 to 8 grams per hour, to produce enough oxygen for lift-off for a crew arriving 26 months later. “So we’re talking about a system that’s a couple of hundred times bigger than MOXIE,” Hoffman says.

They calculate this rate accounting for the eight months needed to get to Mars, followed by some time to set up the system. “We figure you’d probably have maybe 14 months to make all the oxygen,” Hoffman says. Further, the produced oxygen would have to be liquefied to be used as rocket propellant, something the current version of MOXIE doesn’t do.
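Those numbers are easy to sanity-check. In the sketch below, the 30.4-day month and the 7-gram-per-hour midpoint are my assumptions, not figures from the paper:

```python
# Rough sanity check of the scale-up numbers quoted above. Assumptions
# (mine, not the article's): 30.4-day months, and the midpoint of MOXIE's
# current 6-to-8-gram-per-hour output.
oxygen_needed_kg = 31_000          # oxygen share of the ascent propellant
production_hours = 14 * 30.4 * 24  # ~14 months of continuous operation

rate_kg_per_h = oxygen_needed_kg / production_hours
print(f"required rate: {rate_kg_per_h:.1f} kg/h")  # ~3 kg/h, as the article says

moxie_rate_kg_per_h = 0.007  # midpoint of 6-8 g/h
print(f"scale-up vs. today's MOXIE: {rate_kg_per_h / moxie_rate_kg_per_h:.0f}x")
```

The result lands in the 2-to-3-kilogram-per-hour band the researchers quote, and the scale-up factor is indeed on the order of a few hundred.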

MOXIE also currently faces several design constraints because, says Hoffman, a former astronaut, “our only ride to Mars was inside the Perseverance rover.” This limited the amount of power available to operate the unit, the amount of heat it could produce, its volume, and its mass.

“MOXIE does not work nearly as efficiently as a stand-alone system that was specifically designed would,” says Hoffman. Most of the time, it’s turned off. “Every time we want to make oxygen, we have to heat it up to 800 ºC, so most of the energy goes into heating it up and running the compressor, whereas in a well-designed stand-alone system, most of the energy will go into the actual electrolysis, into actually producing the oxygen.”

However, there are still many kinks to iron out for the scaling-up process. To begin with, any oxygen-producing system will need lots of power. Hoffman thinks nuclear power is the most likely option, maybe NASA’s Kilopower fission reactors. The setup and the cabling would certainly be challenging, he says. “You’re going to have to launch all of these nuclear reactors, and of course, they’re not going to be in exactly the same place as the [other] units,” he says. “So, robotically, you’re going to have to connect the electrical cables to bring power to the oxygen-producing unit.”

Then there are the solid oxide electrolysis units, which Hoffman points out are carefully machined systems. Fortunately, the company that makes them, OxEon, has already designed, built, and tested a full-scale unit, a hundred times bigger than the one on MOXIE. “Several of those units would be required to produce oxygen at the quantities that we need,” Hoffman says.

He also adds that at present, there is no redundancy built into MOXIE. If any part fails, the whole system dies. “If you’re counting on a system to produce oxygen for rocket propellant and for breathing, you need very high reliability, which means you’re going to need quite a few redundant units.”

Moreover, the system has to be pretty much autonomous, Hoffman says. “It has to be able to monitor itself, run itself.” For testing purposes, every time MOXIE is powered up, there is plenty of time to plan. A full-scale MOXIE system, though, would have to run continuously, and for that it has to be able to adjust automatically to changes in the Martian atmosphere, which can vary by a factor of two over a year, as well as to the temperature differences between night and day.


Match ID: 19 Score: 2.14 source: spectrum.ieee.org age: 16 days
qualifiers: 2.14 carbon

How Pakistan floods are linked to climate change
Fri, 02 Sep 2022 13:42:00 GMT
Pakistan's geography - and its immense glaciers - make it vulnerable to climate change.
Match ID: 20 Score: 2.14 source: www.bbc.co.uk age: 22 days
qualifiers: 2.14 climate change

Climate change: 'Staggering' rate of global tree losses from fires
Wed, 17 Aug 2022 09:00:36 GMT
A report says around 16 football pitches of trees per minute were lost to wildfires in 2021.
Match ID: 21 Score: 2.14 source: www.bbc.co.uk age: 38 days
qualifiers: 2.14 climate change

Inside the Universe Machine: The Webb Space Telescope’s Staggering Vision
Wed, 06 Jul 2022 13:00:00 +0000


For a deep dive into the engineering behind the James Webb Space Telescope, see our collection of posts here.

“Build something that will absolutely, positively work.” This was the mandate from NASA for designing and building the James Webb Space Telescope—at 6.5 meters wide, the largest space telescope in history. Last December, JWST launched famously and successfully to its observing station out beyond the moon. And now, according to NASA, as soon as next week, the JWST will at long last begin releasing scientific images and data.

Mark Kahan, on JWST’s product integrity team, recalls NASA’s engineering challenge as a call to arms for a worldwide team of thousands that set out to create one of the most ambitious scientific instruments in human history. Kahan—chief electro-optical systems engineer at Mountain View, Calif.–based Synopsys—and many others in JWST’s “pit crew” (as he calls the team) drew hard lessons from three decades ago, having helped repair another world-class space telescope with a debilitating case of flawed optics. Of course the Hubble Space Telescope is in low Earth orbit, and so a special space-shuttle mission to install corrective optics (as happened in 1993) was entirely possible.

Not so with the JWST.

The meticulous care NASA demanded of JWST’s designers is all the more a necessity because Webb is well out of reach of repair crews. Its mission is to study the infrared universe, and that requires shielding the telescope and its sensors from both the heat of sunlight and the infrared glow of Earth. A good place to do that without getting too far from Earth is an empty patch of interplanetary space 1.5 million kilometers away (well beyond the moon’s orbit) near a spot physicists call the second Lagrange point, or L2.

The pit crew’s job was “down at the detail level, error checking every critical aspect of the optical design,” says Kahan. Having learned the hard way from Hubble, the crew insisted that every measurement on Webb’s optics be made in at least two different ways that could be checked and cross-checked. Diagnostics were built into the process, Kahan says, so that “you could look at them to see what to kick” to resolve any discrepancies. Their work had to be done on the ground, but their tests had to assess how the telescope would work in deep space at cryogenic temperatures.

Three New Technologies for the Main Mirror

Superficially, Webb follows the design of all large reflecting telescopes. A big mirror collects light from stars, galaxies, nebulae, planets, comets, and other astronomical objects and focuses those photons onto a smaller secondary mirror, which sends the light to a third mirror that ultimately directs it to instruments that record images and spectra.

Webb’s 6.5-meter primary mirror is the first segmented mirror to be launched into space. All the optics had to be made on the ground at room temperature but were deployed in space and operated at 30 to 55 degrees above absolute zero. “We had to develop three new technologies” to make it work, says Lee D. Feinberg of the NASA Goddard Space Flight Center, the optical telescope element manager for Webb for the past 20 years.

The longest wavelength that Hubble has to contend with is 2.5 micrometers, whereas Webb is built to observe infrared light stretching to 28 μm in wavelength. Compared with Hubble, whose primary mirror is a circle with an area of 4.5 square meters, “[Webb’s primary mirror] had to be 25 square meters,” says Feinberg. Webb also “needed segmented mirrors that were lightweight, and its mass was a huge consideration,” he adds. No single-component mirror that could provide the required resolution would have fit on the Ariane 5 rocket that launched JWST. That meant the mirror would have to be made in pieces, assembled, folded, secured to withstand the stress of launch, then unfolded and deployed in space to create a surface that was within tens of nanometers of the shape specified by the designers.

The James Webb Space Telescope [left] and the Hubble Space Telescope side by side, with Hubble’s 2.4-meter-diameter mirror versus Webb’s array of hexagonal mirrors making a 6.5-meter-diameter light-collecting area; both dwarf a human figure shown for scale. NASA Goddard Space Flight Center

NASA and the U.S. Air Force, which has its own interests in large lightweight space mirrors for surveillance and focusing laser energy, teamed up to develop the technology. The two agencies narrowed eight submitted proposals down to two approaches for building JWST’s mirrors: one based on low-expansion glass made of a mixture of silicon and titanium dioxides similar to that used in Hubble, and the other on the light but highly toxic metal beryllium. The most crucial issue came down to how well the materials could withstand temperature changes from room temperature on the ground to around 50 K in space. Beryllium won because it could fully release stress after cooling without changing its shape, and it’s not vulnerable to the cracking that can occur in glass. The final beryllium mirror is a 6.5-meter array of 18 hexagonal beryllium segments, each weighing about 20 kilograms. The weight per unit area of JWST’s mirror is only 10 percent of Hubble’s. A 100-nanometer layer of pure gold makes the surface reflect 98 percent of incident light across JWST’s main observing band of 0.6 to 28.5 μm. “Pure silver has slightly higher reflectivity than pure gold, but gold is more robust,” says Feinberg. A thin layer of amorphous silica protects the metal film from surface damage.
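The figures quoted above can be cross-checked with a few lines of arithmetic (light-gathering ratio, per-segment area, areal density):

```python
# Quick arithmetic on the mirror figures quoted in the text: collecting
# area, per-segment area, and areal density of Webb's primary mirror.
webb_area_m2 = 25.0    # Webb primary mirror collecting area
hubble_area_m2 = 4.5   # Hubble primary mirror area
segments = 18          # hexagonal beryllium segments
segment_mass_kg = 20.0 # approximate mass per segment

print(f"light-gathering ratio: {webb_area_m2 / hubble_area_m2:.1f}x")
print(f"area per segment: {webb_area_m2 / segments:.2f} m^2")
print(f"areal density: {segments * segment_mass_kg / webb_area_m2:.1f} kg/m^2")
```

The roughly 5.6-fold area advantage over Hubble, combined with an areal density around 14 kg/m², shows why beryllium segments made the 6.5-meter aperture launchable at all.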

In addition, a wavefront-sensing control system keeps mirror segment surfaces aligned to within tens of nanometers. Built on the ground, the system is expected to keep mirror alignment stabilized throughout the telescope’s operational life. A backplane kept at a temperature of 35 K holds all 2.4 tonnes of the telescope and instruments rock-steady to within 32 nm while maintaining them at cryogenic temperatures during observations.

The JWST backplane, the “spine” that supports the entire hexagonal mirror structure and carries more than 2,400 kg of hardware, is readied for assembly to the rest of the telescope. NASA/Chris Gunn

Hubble’s amazing, long-exposure images of distant galaxies are possible through the use of gyroscopes and reaction wheels. The gyroscopes are used to sense unwanted rotations, and reaction wheels are used to counteract them.

But the gyroscopes used on Hubble have had a bad track record and have had to be replaced repeatedly. Only three of Hubble’s six gyros remain operational today, and NASA has devised plans for operating with one or two gyros at reduced capability. Hubble also includes reaction wheels and magnetic torquers, used to maintain its orientation when needed or to point at different parts of the sky.

Webb uses reaction wheels similarly to turn across the sky, but instead of using mechanical gyros to sense direction, it uses hemispherical resonator gyroscopes, which have no moving parts. Webb also has a small fine-steering mirror in the optical path, which can tilt over an angle of just 5 arc seconds. Those very fine adjustments of the light path into the instruments keep the telescope on target. “It’s a really wonderful way to go,” says Feinberg, adding that it compensates for small amounts of jitter without having to move the whole 6-tonne observatory.
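The division of labor Feinberg describes can be sketched as a simple control loop: the gyroscopes sense a small line-of-sight error, and the fine-steering mirror tilts to cancel most of it without moving the observatory. This is a toy model only; the gain, the clip limit, and the error values are illustrative assumptions, not Webb's actual flight control law.

```python
# A minimal sketch of fine-steering-mirror (FSM) jitter rejection as a
# proportional loop. Gain, limit, and error values are illustrative
# assumptions, not Webb's actual control law.

def fsm_correction(measured_offset_arcsec, gain=0.8, limit=5.0):
    """Tip/tilt command that opposes the measured line-of-sight error,
    clipped to the FSM's roughly +/-5-arcsecond throw."""
    cmd = -gain * measured_offset_arcsec
    return max(-limit, min(limit, cmd))

# A 0.2-arcsec pointing error is mostly absorbed by the small mirror
# instead of slewing the whole 6-tonne observatory.
residual = 0.2 + fsm_correction(0.2)  # ~0.04 arcsec left over
```

In a real system the loop runs continuously and the reaction wheels handle any error that exceeds the mirror's small throw.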

Instruments

Other optics distribute light from the fine-steering mirror among four instruments, two of which can observe simultaneously. Three instruments have sensors that observe wavelengths of 0.6 to 5 μm, which astronomers call the near-infrared. The fourth, called the Mid-InfraRed Instrument (MIRI), observes what astronomers call the mid-infrared spectrum, from 5 to 28.5 μm. Different instruments are needed because sensors and optics have limited wavelength ranges. (Optical engineers may blanch slightly at astronomers’ definitions of what constitutes the near- and mid-infrared wavelength ranges. These two groups simply have differing conventions for labeling the various regimes of the infrared spectrum.)

Mid-infrared wavelengths are crucial for observing young stars, planetary systems, and the earliest galaxies, but they also pose some of the biggest engineering challenges: everything on Earth, and every planet out to Jupiter, glows in the mid-infrared. For JWST to observe distant astronomical objects, it must therefore avoid recording extraneous mid-infrared noise from all the various sources inside the solar system. “I have spent my whole career building instruments for wavelengths of 5 μm and longer,” says MIRI instrument scientist Alistair Glasse of the Royal Observatory, in Edinburgh. “We’re always struggling against thermal background.”

Mountaintop telescopes can see the near-infrared, but observing the mid-infrared sky requires telescopes in space. However, the thermal radiation from Earth and its atmosphere can cloud their view, and so can the telescopes themselves unless they are cooled far below room temperature. An ample supply of liquid helium and an orbit far from Earth allowed the Spitzer Space Telescope’s primary observing mission to last for five years, but once the last of the cryogenic fluid evaporated in 2009, its observations were limited to wavelengths shorter than 5 μm.

Webb has an elaborate sunshield to block sunlight, and an orbit 1.5 million km from Earth, which together keep the telescope below 55 K. That’s still not good enough for low-noise observations at wavelengths longer than 5 μm. The near-infrared instruments operate at 40 K to minimize thermal noise, but for observations out to 28.5 μm, MIRI relies on a specially developed closed-cycle helium cryocooler to stay below 7 K. “We want to have sensitivity limited by the shot noise of astronomical sources,” says Glasse. (Shot noise occurs when optical or electrical signals are so feeble that each photon or electron constitutes a detectable peak.) That will make MIRI 1,000 times as sensitive in the mid-infrared as Spitzer.
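The shot-noise limit Glasse mentions follows from Poisson statistics: a mean of N detected photons fluctuates by √N, so the best achievable signal-to-noise ratio is √N. A minimal sketch, with photon counts that are illustrative rather than MIRI specifications:

```python
import math

# Shot-noise-limited sensing: when each photon is individually detectable,
# a mean of N detected photons fluctuates by sqrt(N) (Poisson statistics),
# so the best achievable signal-to-noise ratio is N / sqrt(N) = sqrt(N).

def shot_noise_snr(photons):
    return photons / math.sqrt(photons)

print(shot_noise_snr(100))    # 10.0
print(shot_noise_snr(10000))  # 100.0 -- 100x the photons, 10x the SNR
```

The square-root scaling is why cutting thermal background, rather than just integrating longer, pays off so handsomely at these wavelengths.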

Another challenge is the limited transparency of optical materials in the mid-infrared. “We use reflective optics wherever possible,” says Glasse, but they also pose problems, he adds. “Thermal contraction is a big deal,” he says, because the instrument was made at room temperature but is used at 7 K. To keep thermal changes uniform throughout MIRI, they made the whole structure of gold-coated aluminum lest other metals cause warping.

Detectors are another problem. Webb’s near-infrared sensors use mercury cadmium telluride photodetectors with a resolution of 2,048 × 2,048 pixels. That resolution is widely available at wavelengths below 5 μm, but sensing at MIRI’s longer wavelengths requires exotic detectors limited to 1,024 × 1,024 pixels.

Glasse says commissioning “has gone incredibly well.” Although some stray light has been detected, he says, “we are fully expecting to meet all our science goals.”

NIRCam Aligns the Whole Telescope

The near-infrared detectors and optical materials used for observing at wavelengths shorter than 5 μm are much more mature than those for the mid-infrared, so the Near-Infrared Camera (NIRCam) does double duty by both recording images and aligning all the optics in the whole telescope. That alignment was the trickiest part of building the instrument, says NIRCam principal investigator Marcia Rieke of the University of Arizona.

Alignment means ensuring that all the light collected by the primary mirror lands in the right place in the final image. That’s crucial for Webb, because its 18 separate segments have to overlay their images perfectly, and because all those segments were built on the ground at room temperature but operate at cryogenic temperatures in zero gravity. When NASA recorded a test image of a single star after Webb first opened its primary mirror, it showed 18 separate bright spots, one from each segment. When alignment was completed on 11 March, the image from NIRCam showed a single star with six spikes caused by diffraction.
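Conceptually, collapsing 18 spots into one works like the toy sketch below: measure each segment's spot centroid on the detector, then command each segment to move its spot to the common target. The coordinates and the direct pixel-offset commands are simplifying assumptions for illustration, not Webb's actual wavefront-sensing pipeline.

```python
# Toy "image stacking" step: compute per-segment moves that bring every
# segment's spot to the shared mean position. Real alignment maps these
# detector offsets to segment tip/tilt commands; here we stop at pixels.

def stacking_offsets(centroids):
    """Return per-segment (dx, dy) moves that bring every spot to the mean."""
    n = len(centroids)
    cx = sum(x for x, _ in centroids) / n
    cy = sum(y for _, y in centroids) / n
    return [(cx - x, cy - y) for x, y in centroids]

# Three of the 18 segments, with made-up spot positions in pixels.
spots = [(10.0, 12.0), (14.0, 12.0), (12.0, 9.0)]
print(stacking_offsets(spots))
```

After stacking, finer phasing steps (beyond this sketch) align the segments to within tens of nanometers so they behave as a single mirror.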

Even when performing instrumental calibration tasks, JWST couldn’t help but showcase its stunning sensitivity to the infrared sky. The central star is what telescope technicians used to align JWST’s mirrors. But notice the distant galaxies and stars that photobombed the image too! NASA/STScI

Building a separate alignment system would have added to both the weight and cost of Webb, Rieke realized, and in the original 1995 plan for the telescope she proposed designing NIRCam so it could align the telescope optics once in space as well as record images. “The only real compromise was that it required NIRCam to have exquisite image quality,” says Rieke, wryly. From a scientific point of view, she adds, using the instrument to align the telescope optics “is great because you know you’re going to have good image quality and it’s going to be aligned with you.” Alignment might be just a tiny bit off for the other instruments. In the end, it took a team at Lockheed Martin to develop the computational tools to account for all the elements of thermal expansion.

Escalating costs and delays had troubled Webb for years. But for Feinberg, “commissioning has been a magical five months.” It began with the sight of sunlight hitting the mirrors. The segmented mirror deployed smoothly, and after the near-infrared cameras cooled, the mirrors focused one star into 18 spots, then aligned them to put the spots on top of each other. “Everything had to work to get it to [focus] that well,” he says. It’s been an intense time, but for Feinberg, a veteran of the Hubble repair mission, commissioning Webb was “a piece of cake.”

NASA announced that between May 23rd and 25th, one segment of the primary mirror had been dinged by a micrometeorite bigger than the agency had expected when it analyzed the potential results of such impacts. “Things do degrade over time,” Feinberg said. But he added that Webb had been engineered to minimize damage, and NASA said the event had not affected Webb’s operation schedule.

Corrections 26-28 July 2022: The story was updated a) to reflect the fact that the Lagrange point L2 where Webb now orbits is not that of the "Earth-moon system" (as the story had originally reported) but rather the Earth-sun system, and b) to correct misstatements in the original posting about Webb's hardware for controlling its orientation.

Corrections 12 Aug. 2022: Alistair Glasse's name was incorrectly spelled in a previous version of this story, as was NIRCam (which we'd spelled as NIRcam); Webb's tertiary mirror (we'd originally reported only its primary and secondary mirrors) was also called out in this version.

This article appears in the September 2022 print issue as “Inside the Universe Machine.”


Match ID: 22 Score: 2.14 source: spectrum.ieee.org age: 80 days
qualifiers: 2.14 toxic

NASA to Industry: Let’s Develop Flight Tech to Reduce Carbon Emissions
Wed, 29 Jun 2022 14:25 EDT
NASA announced Wednesday the agency is seeking partners to develop technologies needed to shape a new generation of lower-emission, single-aisle airliners that passengers could see in airports in the 2030s.
Match ID: 23 Score: 2.14 source: www.nasa.gov age: 86 days
qualifiers: 2.14 carbon

U.N. Kills Any Plans to Use Mercury as a Rocket Propellant
Tue, 19 Apr 2022 18:00:01 +0000


A recent United Nations provision has banned the use of mercury in spacecraft propellant. Although no private company has actually used mercury propellant in a launched spacecraft, the possibility was alarming enough—and the dangers extreme enough—that the ban was enacted just a few years after one U.S.-based startup began toying with the idea. Had the company gone through with its intention to sell mercury propellant thrusters to some of the companies building massive satellite constellations over the coming decade, it would have resulted in Earth’s upper atmosphere being laced with mercury.

Mercury is a neurotoxin. It’s also bio-accumulative, which means it’s absorbed by the body at a faster rate than the body can remove it. The most common way to get mercury poisoning is through eating contaminated seafood. “It’s pretty nasty,” says Michael Bender, the international coordinator of the Zero Mercury Working Group (ZMWG). “Which is why this is one of the very few instances where the governments of the world came together pretty much unanimously and ratified a treaty.”

Bender is referring to the 2013 Minamata Convention on Mercury, a U.N. treaty named for a city in Japan whose residents suffered from mercury poisoning from a nearby chemical factory for decades. Because mercury pollutants easily find their way into the oceans and the atmosphere, it’s virtually impossible for one country to prevent mercury poisoning within its borders. “Mercury—it’s an intercontinental pollutant,” Bender says. “So it required a global treaty.”

Today, the only remaining permitted uses for mercury are in fluorescent lighting and dental amalgams, and even those are being phased out. Mercury is otherwise found as a by-product of other processes, such as the burning of coal. But then a company hit on the idea to use it as a spacecraft propellant.

In 2018, an employee at Apollo Fusion approached the Public Employees for Environmental Responsibility (PEER), a nonprofit that investigates environmental misconduct in the United States. The employee—who has remained anonymous—alleged that the Mountain View, Calif.–based space startup was planning to build and sell thrusters that used mercury propellant to multiple companies building low Earth orbit (LEO) satellite constellations.

Four industry insiders ultimately confirmed that Apollo Fusion was building thrusters that utilized mercury propellant. Apollo Fusion, which was acquired by rocket manufacturing startup Astra in June 2021, insisted that the composition of its propellant mixture should be considered confidential information. The company withdrew its plans for a mercury propellant in April 2021. Astra declined to respond to a request for comment for this story.

Apollo Fusion wasn’t the first to consider using mercury as a propellant. NASA originally tested it in the 1960s and 1970s with two Space Electric Propulsion Tests (SERT), one of which was sent into orbit in 1970. Although the tests demonstrated mercury’s effectiveness as a propellant, the same concerns over the element’s toxicity that have seen it banned in many other industries halted its use by the space agency as well.

“I think it just sort of fell off a lot of folks’ radars,” says Kevin Bell, the staff counsel for PEER. “And then somebody just resurrected the research on it and said, ‘Hey, other than the environmental impact, this was a pretty good idea.’ It would give you a competitive advantage in what I imagine is a pretty tight, competitive market.”

That’s presumably why Apollo Fusion was keen on using it in their thrusters. Apollo Fusion as a startup emerged more or less simultaneously with the rise of massive LEO constellations that use hundreds or thousands of satellites in orbits below 2,000 kilometers to provide continual low-latency coverage. Finding a slightly cheaper, more efficient propellant for one large geostationary satellite doesn’t move the needle much. But doing the same for thousands of satellites that need to be replaced every several years? That’s a much more noticeable discount.

Were it not for mercury’s extreme toxicity, it would actually make an extremely attractive propellant. Apollo Fusion wanted to use a type of ion thruster called a Hall-effect thruster. Ion thrusters strip electrons from the atoms that make up a liquid or gaseous propellant, and then an electric field pushes the resultant ions away from the spacecraft, generating a modest thrust in the opposite direction. The physics of rocket engines means that the performance of these engines increases with the mass of the ion that you can accelerate.

Mercury is heavier than either xenon or krypton, the most commonly used propellants, meaning more thrust per expelled ion. It’s also liquid at room temperature, making it efficient to store and use. And it’s cheap—there’s not much competition for anyone looking to buy mercury.
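The thrust-per-ion argument can be made concrete: accelerated through a fixed voltage V, an ion of charge q and mass m exits at v = √(2qV/m), so the momentum it carries away is mv = √(2qVm), which grows with the square root of the ion's mass. The 300-volt accelerating potential below is an illustrative assumption, not any real thruster's operating point.

```python
import math

# Momentum per singly charged ion after acceleration through a fixed
# potential: p = sqrt(2 * q * V * m). Heavier ions carry more momentum,
# hence more thrust per expelled ion. V = 300 volts is assumed.

Q = 1.602e-19    # elementary charge, coulombs
AMU = 1.661e-27  # atomic mass unit, kilograms

def momentum_per_ion(mass_amu, volts=300.0):
    mass_kg = mass_amu * AMU
    return math.sqrt(2 * Q * volts * mass_kg)

for name, amu in [("krypton", 83.8), ("xenon", 131.3), ("mercury", 200.6)]:
    print(f"{name:8s} {momentum_per_ion(amu):.2e} kg*m/s per ion")
```

Mercury's roughly 1.5x mass advantage over xenon translates into about 24 percent more momentum per ion at the same voltage.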

Bender says that ZMWG, alongside PEER, caught wind of Apollo Fusion marketing its mercury-based thrusters to at least three companies deploying LEO constellations—OneWeb, Planet Labs, and SpaceX. Planet Labs, an Earth-imaging company, has at least 200 CubeSats in low Earth orbit. OneWeb and SpaceX, both wireless-communication providers, have many more. OneWeb plans to have nearly 650 satellites in orbit by the end of 2022. SpaceX already has nearly 1,500 active satellites aloft in its Starlink constellation, with an eye toward deploying as many as 30,000 satellites before its constellation is complete. Other constellations, like Amazon’s Kuiper constellation, are also planning to deploy thousands of satellites.

In 2019, a group of researchers in Italy and the United States estimated how much of the mercury used in spacecraft propellant might find its way back into Earth’s atmosphere. They figured that a hypothetical LEO constellation of 2,000 satellites, each carrying 100 kilograms of propellant, would emit 20 tonnes of mercury every year over the course of a 10-year life span. Three quarters of that mercury, the researchers suggested, would eventually wind up in the oceans.

That amounts to 1 percent of global mercury emissions from a constellation only a fraction of the size of the one planned by SpaceX alone. And if multiple constellations adopted the technology, they would represent a significant percentage of global mercury emissions—especially, the researchers warned, as other uses of mercury are phased out as planned in the years ahead.
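The arithmetic behind the researchers' estimate can be checked directly; the even burn rate over the 10-year lifespan is their simplifying assumption:

```python
# Back-of-the-envelope check of the 2019 study's figures described above.

satellites = 2000
propellant_kg = 100       # mercury carried per satellite
lifespan_years = 10

total_tonnes = satellites * propellant_kg / 1000  # 200 tonnes in all
per_year = total_tonnes / lifespan_years          # emitted per year
to_oceans = 0.75 * per_year                       # share reaching the oceans

print(per_year, to_oceans)  # 20.0 15.0
```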

Fortunately, it’s unlikely that any mercury propellant thrusters will even get off the ground. Prior to the fourth meeting of the Minamata Convention, Canada, the European Union, and Norway highlighted the dangers of mercury propellant, alongside ZMWG. The provision to ban mercury usage in satellites was passed on 26 March 2022.

The question now is enforcement. “Obviously, there aren’t any U.N. peacekeepers going into space to shoot down” mercury-based satellites, says Bell. But the 137 countries, including the United States, who are party to the convention have pledged to adhere to its provisions—including the propellant ban.

The United States is notable in that list because as Bender explains, it did not ratify the Minamata Convention via the U.S. Senate but instead deposited with the U.N. an instrument of acceptance. In a 7 November 2013 statement (about one month after the original Minamata Convention was adopted), the U.S. State Department said the country would be able to fulfill its obligations “under existing legislative and regulatory authority.”

Bender says the difference is “weedy” but that this appears to mean that the U.S. government has agreed to adhere to the Minamata Convention’s provisions because it already has similar laws on the books. Except there is still no existing U.S. law or regulation banning mercury propellant. For Bender, that creates some uncertainty around compliance when the provision goes into force in 2025.

Still, with a U.S. company being the first startup to toy with mercury propellant, it might be ideal to have a stronger U.S. ratification of the Minamata Convention before another company hits on the same idea. “There will always be market incentives to cut corners and do something more dangerously,” Bell says.

Update 19 April 2022: In an email, a spokesperson for Astra stated that the company's propulsion system, the Astra Spacecraft Engine, does not use mercury. The spokesperson also stated that Astra has no plans to use mercury propellant and that the company does not have anything in orbit that uses mercury.

Updated 20 April 2022 to clarify that Apollo Fusion was building thrusters that used mercury, not that they had actually used them.


Match ID: 24 Score: 2.14 source: spectrum.ieee.org age: 158 days
qualifiers: 2.14 toxic

Ahrefs vs SEMrush: Which SEO Tool Should You Use?
Tue, 01 Mar 2022 12:16:00 +0000
semrush vs ahrefs


SEMrush and Ahrefs are among the most popular tools in the SEO industry. Both companies have been in business for years and have thousands of customers per month.

If you're a professional SEO or trying to do digital marketing on your own, at some point you'll likely consider using a tool to help with your efforts. Ahrefs and SEMrush are two names that will likely appear on your shortlist.

In this guide, I'm going to help you learn more about these SEO tools and how to choose the one that's best for your purposes.

What is SEMrush?

semrush

SEMrush is a popular SEO tool with a wide range of features—it's the leading competitor-research service for online marketers. Its Keyword Magic tool offers over 20 billion keywords for Google, constantly updated, in what the company bills as the largest keyword database available.

The program began in 2007 as SeoQuake, a small Firefox extension.

Features

  • Most accurate keyword data: Accurate search volume data is crucial for SEO and PPC campaigns, letting you identify which keywords are most likely to bring in sales from ad clicks. SEMrush constantly updates its databases to keep this data accurate.
  • Largest keyword database: SEMrush's Keyword Magic Tool now features 20 billion keywords, giving marketers and SEO professionals the largest database of keywords.

  • All SEMrush users receive daily ranking data, mobile volume information, and the option to track additional keywords by default, with no additional payment or add-ons needed
  • Most accurate position tracking tool: Position Tracking gives every subscriber basic tracking capabilities suitable for SEO professionals, and it provides local-level data to all users.
  • SEO Data Management: SEMrush makes managing your online data easy by allowing you to create visually appealing custom PDF reports, including Branded and White Label reports, report scheduling, and integration with GA, GMB, and GSC.
  • Toxic link monitoring and penalty recovery: With SEMrush, you can make a detailed analysis of toxic backlinks, toxic scores, toxic markers, and outreach to those sites.
  • Content Optimization and Creation Tools: SEMrush offers content optimization and creation tools that let you create SEO-friendly content, including the SEO Writing Assistant, On-Page SEO Checker, SEO Content Template, Content Audit, Post Tracking, and Brand Monitoring.

Ahrefs

ahrefs


Ahrefs is a leading SEO platform that offers a set of tools to grow your search traffic, research your competitors, and monitor your niche. Founded in 2010, the company has become a popular choice among SEO tools. Ahrefs maintains an index of over 10.3 billion keywords and offers extensive backlink data, updated every 15 to 30 minutes, in what it calls the world's largest backlink index.

Features

  • Backlink and new-keyword alerts: Get an alert when your site is linked to or discussed in blogs, forums, or comments, or when new keywords are added to a blog post about you.
  • Intuitive interface: The intuitive design of the widget helps you see the overall health of your website and search engine ranking at a glance.
  • Site Explorer: The Site Explorer will give you an in-depth look at your site's search traffic.
  • Domain Comparison
  • Reports with charts and graphs
  • A site audit with JavaScript rendering can identify SEO issues.
  • A question explorer that provides well-crafted topic suggestions

Direct Comparisons: Ahrefs vs SEMrush

Now that you know a little more about each tool, let's take a look at how they compare. I'll analyze each tool to see how they differ in interfaces, keyword research resources, rank tracking, and competitor analysis.

User Interface

Ahrefs and SEMrush both offer comprehensive information and quick metrics regarding your website's SEO performance. However, Ahrefs takes a bit more of a hands-on approach to getting your account fully set up, whereas SEMrush's simpler dashboard can give you access to the data you need quickly.

In this section, we provide a brief overview of the elements found on each dashboard and highlight the ease with which you can complete tasks.

AHREFS

ahrefs interface


The Ahrefs dashboard is less cluttered than that of SEMrush, and its primary menu is at the very top of the page, with a search bar designed only for entering URLs.

Additional features of the Ahrefs platform include:

  • You can see analytics from the dashboard, from search engine rankings to domain ratings, referring domains, and backlinks
  • Jumping from one tool to another is easy. You can use the Keyword Explorer to find a keyword to target and then directly track your ranking with one click.
  • The website offers a tooltip helper tool that allows you to hover your mouse over something that isn't clear and get an in-depth explanation.

SEMRUSH

semrush domain overview


When you log into the SEMrush Tool, you will find four main modules. These include information about your domains, organic keyword analysis, ad keyword, and site traffic.

You'll also find some other options, like:

  • A search bar allows you to enter a domain, keyword, or anything else you wish to explore.
  • A menu on the left side of the page provides quick links to relevant information, including marketing insights, projects, keyword analytics, and more.
  • The customer support resources located directly within the dashboard can be used to communicate with the support team or to learn about other resources such as webinars and blogs.
  • Detailed descriptions of every resource offered. This detail is beneficial for new marketers, who are just starting.

WHO WINS?

Both Ahrefs and SEMrush have user-friendly dashboards, but Ahrefs is less cluttered and easier to navigate. On the other hand, SEMrush offers dozens of extra tools, including access to customer support resources.

When deciding on which dashboard to use, consider what you value in the user interface, and test out both.

Rank Tracking

If you're looking to track your website's search engine ranking, rank tracking features can help. You can also use them to monitor your competitors.

Let's take a look at Ahrefs vs. SEMrush to see which tool does a better job.

Ahrefs

ahrefs rank tracking


The Ahrefs Rank Tracker is simpler to use. Just type in the domain name and keywords you want to analyze, and it spits out a report showing you the search engine results page (SERP) ranking for each keyword you enter.

Rank Tracker looks at the ranking performance of keywords and compares them with the top rankings for those keywords. Ahrefs also offers:

  • Metrics that help you understand your visibility, traffic, average position, and keyword difficulty
  • An indication of whether a keyword would be profitable to target

SEMRUSH

semrush position tracking


SEMrush offers a tool called Position Tracking. This tool is a project tool—you must set it up as a new project. Below are a few of its most popular features:

  • Regular data updates and mobile search rankings for all subscribers
  • The ability to track several SERP features, including local tracking
  • Intuitive reports that track statistics for the pages on your website, as well as the keywords used on those pages
  • A Cannibalization report that identifies pages that may be competing with each other

WHO WINS?

Ahrefs is a more user-friendly option. It takes seconds to enter a domain name and keywords. From there, you can quickly decide whether to proceed with that keyword or figure out how to rank better for other keywords.

SEMrush allows you to check your mobile rankings and ranking updates daily, which is something Ahrefs does not offer. SEMrush also offers social media rankings, a tool you won't find within the Ahrefs platform. Both are good; let me know in the comments which one you prefer.

Keyword Research

Keyword research is closely related to rank tracking, but it's used for deciding which keywords you plan on using for future content rather than those you use now.

When it comes to SEO, keyword research is the most important thing to consider when comparing the two platforms.

AHREFS



The Ahrefs Keyword Explorer provides you with thousands of keyword ideas and filters search results based on the chosen search engine.

Ahrefs supports several features, including:

  • It can search multiple keywords in a single query and analyze them together. (SEMrush offers the same capability in its Keyword Overview.)
  • Ahrefs covers a variety of search engines, including Google, YouTube, Amazon, Bing, Yahoo, Yandex, and others.
  • When you click on a keyword, you can see its search volume and keyword difficulty, as well as related keywords you haven't used.

SEMRUSH



SEMrush's Keyword Magic Tool has over 20 billion keywords for Google. You can type in any keyword you want, and a list of suggested keywords will appear.

The Keyword Magic Tool also lets you:

  • Show performance metrics by keyword
  • Filter search results by both broad and exact keyword matches.
  • Show data like search volume, trends, keyword difficulty, and CPC.
  • Show the first 100 Google search results for any keyword.
  • Identify SERP Features and Questions related to each keyword
  • SEMrush has released a new Keyword Gap Tool that uncovers potentially useful keyword opportunities for you, including both paid and organic keywords.

WHO WINS?

Both of these tools offer keyword research features and allow users to break down complicated tasks into something that can be understood by beginners and advanced users alike.

If you're interested in keyword suggestions, SEMrush appears to have more keyword suggestions than Ahrefs does. It also continues to add new features, like the Keyword Gap tool and SERP Questions recommendations.

Competitor Analysis

Both platforms offer competitor analysis tools, eliminating the need to come up with keywords off the top of your head. Each tool is useful for finding keywords that will be useful for your competition so you know they will be valuable to you.

AHREFS



Ahrefs' domain comparison tool lets you compare up to five websites (your website and four competitors) side by side. It also shows how your site ranks against the others, with metrics such as backlinks, domain ratings, and more.

Use the Competing Domains section to see a list of your most direct competitors, and explore how many keyword matches your competitors have.

To find more information about your competitor, you can look at the Site Explorer and Content Explorer tools and type in their URL instead of yours.

SEMRUSH



SEMrush provides a variety of insights into your competitors' marketing tactics. The platform enables you to research your competitors effectively. It also offers several resources for competitor analysis including:

Traffic Analytics helps you identify where your audience comes from, how they engage with your site, what devices visitors use to view your site, and how your audiences overlap with other websites.

SEMrush's Organic Research examines your website's major competitors and shows their organic search rankings, the keywords they rank for, and even whether they rank for any SERP features.

The Market Explorer search field allows you to type in a domain and lists websites or articles similar to what you entered. Market Explorer also lets users perform in-depth data analytics on these companies and markets.

WHO WINS?

SEMrush wins here because it has more tools dedicated to competitor analysis than Ahrefs. However, Ahrefs offers a lot of functionality in this area, too. It takes a combination of both tools to gain an advantage over your competition.

Pricing

Ahrefs

  • Lite Monthly: $99/month
  • Standard Monthly: $179/month
  • Annually Lite: $990/year
  • Annually Standard: $1,790/year

SEMRUSH

  • Pro Plan: $119.95/month
  • Guru Plan: $229.95/month
  • Business Plan: $449.95/month

Which SEO tool should you choose for digital marketing?

When it comes to keyword research, it can be hard to decide between the two.

Consider choosing Ahrefs if you:

  • Like a friendly, clean interface
  • Want simple keyword suggestions

  • Want to get more keywords for different search engines like Amazon, Bing, Yahoo, Yandex, Baidu, and more

 

Consider SEMrush if you:

  • Want more marketing and SEO features
  • Need competitor analysis tools
  • Need to keep your backlink profile clean
  • Want more keyword suggestions for Google

Both tools are great. Choose the one that meets your requirements, and if you have experience using either Ahrefs or SEMrush, let me know in the comment section which works well for you.

 

 


Match ID: 25 Score: 2.14 source: www.crunchhype.com age: 207 days
qualifiers: 2.14 toxic

Filter efficiency 96.615 (26 matches/768 results)


********** UNIVERSITY **********
return to top



Forget Oxbridge: St Andrews knocks top universities off perch
Sat, 24 Sep 2022 07:00:07 GMT

Latest Guardian University Guide shows leading trio are in league of their own for undergraduate courses

Oxbridge is being replaced at the apex of UK universities by “Stoxbridge” after St Andrews overtook Oxford and Cambridge at the top of the latest Guardian University Guide.

It is the first time the Fife university has been ranked highest in the Guardian’s annual guide to undergraduate courses, pushing Oxford into second and Cambridge into third.

Continue reading...
Match ID: 0 Score: 30.00 source: www.theguardian.com age: 0 days
qualifiers: 30.00 rankings

The Guardian University Guide 2023 – the rankings
Sat, 24 Sep 2022 06:59:07 GMT

Find a course at one of the top universities in the country. Our league tables rank them all subject by subject, as well as by student satisfaction, staff numbers, spending and career prospects

Continue reading...
Match ID: 1 Score: 30.00 source: www.theguardian.com age: 0 days
qualifiers: 30.00 rankings

Video Friday: Loona
Fri, 16 Sep 2022 18:19:52 +0000


Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

IROS 2022: 23–27 October 2022, KYOTO, JAPAN
ANA Avatar XPRIZE Finals: 4–5 November 2022, LOS ANGELES
CoRL 2022: 14–18 December 2022, AUCKLAND, NEW ZEALAND

Enjoy today's videos!


Another robotic pet on Kickstarter, another bunting of red flags.

Let's see, we've got: "she's so playful and affectionate you'll forget she's a robot." "Everything you can dream of in a best friend and more." "Get ready to fall in love!" And that's literally like the first couple of tiles on the Kickstarter post. Look, the hardware seems fine, and there is a lot of expressiveness going on, I just wish they didn't set you up for an inevitable disappointment when after a couple of weeks it becomes apparent that yes, this is just a robotic toy, and will never be your best friend (or more).

Loona is currently on Kickstarter for about US $300.

[ Kickstarter ]

Inspired by the flexibility and resilience of dragonfly wings, we propose a novel design for a biomimetic drone propeller called Tombo propeller. Here, we report on the design and fabrication process of this biomimetic propeller that can accommodate collisions and recover quickly, while maintaining sufficient thrust force to hover and fly.

[ JAIST ]

Thanks Van!

Meet Tom, a software engineer at Boston Dynamics, as he shares insights on programming and testing the practical—and impractical—applications of robotics. Whether Spot is conducting inspections or playing an instrument, learn how we go from code on a computer to actions in the real world.

Yeah, but where do I get that awesome shirt?!

[ Boston Dynamics ]

This Ameca demo couples automated speech recognition with GPT-3, a large language model that generates meaningful answers. The output is fed to an online TTS service, which generates the voice and visemes for lip-sync timing. The team at Engineered Arts Ltd. poses the questions.

"Meaningful answers."

[ Engineered Arts ]
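The Ameca pipeline described above chains three stages: speech recognition, a language-model reply, and text-to-speech that returns audio plus viseme timings for lip sync. A minimal sketch of that flow, in which every stage is a stand-in stub (the real demo uses a commercial ASR system, GPT-3, and an online TTS API, none of which appear here):

```python
# Sketch of an ASR -> LLM -> TTS pipeline like the Ameca demo's.
# All three stages are hypothetical stubs returning canned values.

def recognize_speech(audio: bytes) -> str:
    # Stand-in for an automated speech recognition service.
    return "What makes you happy?"

def generate_reply(prompt: str) -> str:
    # Stand-in for a large-language-model completion call.
    return "Talking with people like you."

def synthesize(text: str) -> dict:
    # Stand-in for a TTS service returning audio plus viseme timings
    # (phoneme-to-mouth-shape cues used for lip sync).
    return {"audio": b"...", "visemes": [(0.0, "T"), (0.1, "aa")], "text": text}

def ameca_pipeline(audio: bytes) -> dict:
    question = recognize_speech(audio)
    answer = generate_reply(question)
    return synthesize(answer)

result = ameca_pipeline(b"raw-microphone-bytes")
print(result["text"])
```

The interesting design point is that the stages are loosely coupled: any ASR, LLM, or TTS backend with the same text-in/text-out contract could be swapped in.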

The ANT project develops a navigation and motion control system for future walking systems for planetary exploration. After successful testing on ramps and rubble fields, the challenge of climbing rough inclines such as craters is being tackled.

[ DFKI ]

Look, if you’re going to crate-train Spot, at least put some blankets and stuffed animals in there or something.

[ Energy Robotics ]

With multitrade layout, all trades’ layouts are set down with a single pass over the floor by Dusty's FieldPrinter system. Trades experience unparalleled clarity and communication with each other, because they can see each others’ installation plans and immediately identify and resolve conflicts. Instead of fighting over the floor and pointing fingers, they start to solve problems together.

[ Dusty Robotics ]

We present QUaRTM—a novel quadcopter design capable of tilting the propellers into the forward flight direction, which reduces the drag area and therefore allows for faster, more agile, and more efficient flight.

[ HiPeRLab ]

Is there an option in the iRobot app to turn my Roomba into a cake? Because I want cake.

[ iRobot ]

Looks like SoftBank is getting into high-density robotic logistics.

[ Impress ]

GITAI S2 ground test for space debris removal. During this demonstration, a tool changer was also tested to perform several different tasks at OSAM.

[ GITAI ]

Recent advances allow for the automation of food preparation in high-throughput environments, yet the successful deployment of these robots requires the planning and execution of quick, robust, and ultimately collision-free behaviors. In this work, we showcase a novel framework for modifying previously generated trajectories of robotic manipulators in highly detailed and dynamic collision environments.

[ Paper ]

The LCT Hospital in South Korea uses “Dr. LCT” for robotic-based orthopedic knee procedures. The system is based on the KUKA LBR Med robotic platform, which is ideally suited for orthopedic surgery with its seven axes, software developed specifically for medical technology, and appropriate safety measures.

[ Kuka ]

A year in review. Compilation of 2022 video highlights of the Game Changing Development (GCD) Program. The Game Changing Development Program is a part of NASA’s Space Technology Mission Directorate. The program advances space technologies that may lead to entirely new approaches for the agency’s future space missions and provide solutions to significant national needs.

[ NASA ]

Naomi Wu reviews a Diablo mobile robot (with some really cool customizations of her own), sending it out to run errands in Shenzhen during lockdown.

[ Naomi Wu ]

Roundtable discussion on how teaching automation in schools, colleges, and universities can help shape the workers of tomorrow. ABB Robotics has put together a panel of experts in this field to discuss the challenges and opportunities.

[ ABB ]

On 8 September 2022, Mario Santillo of Ford talked to robotics students as the first speaker in the Undergraduate Robotics Pathways & Careers Speaker Series, which aims to answer the question “What can I do with a robotics degree?”

[ Michigan Robotics ]


Match ID: 2 Score: 2.86 source: spectrum.ieee.org age: 7 days
qualifiers: 2.86 school

Filter efficiency 99.609 (3 matches/768 results)


********** TRAVEL **********
return to top



‘The inferno was racing towards me’: survivors of the Summerland fire on the day their holiday paradise burned down
Sat, 24 Sep 2022 09:00:01 GMT

When it opened in 1971, the acrylic-clad complex promised balmy conditions year round. But then a blaze ripped through the building in minutes, killing 50. What happened – and why has the disaster been forgotten?

Heather Lea wasn’t there when it happened; she was 19 and newly married. But her sister June was only 13 and had been looking forward to the family’s annual fortnight on the Isle of Man. Reg, Heather’s husband, drove June and her parents to the ferry terminal in his car. Heather remembers her mother turning to wave goodbye.

The Cheetham family lived in a council house on the edge of Kirkby, Liverpool. Heather shared a bedroom with June and her older sister, Mavis. Their father, Richard, a typesetter in the printing business, was an ex-sergeant in the RAF who insisted shoes were polished on a Sunday night. Her mother, Elizabeth, was timid, with a mischievous side.

Continue reading...
Match ID: 0 Score: 35.00 source: www.theguardian.com age: 0 days
qualifiers: 35.00 travel(|ing)

Five-year-old boy dies after car washed away in flood waters in NSW’s central west
Sat, 24 Sep 2022 06:44:10 GMT

Emergency services rescued four people clinging to trees after two vehicles became trapped in floods

A five-year-old boy has died after the vehicle he was travelling in was washed away in flood waters in New South Wales’ central west.

Two vehicles became trapped in flood waters on the McGrane Way at Tullamore, north-west of Parkes on Friday night.

Sign up to receive an email with the top stories from Guardian Australia every morning

Continue reading...
Match ID: 1 Score: 35.00 source: www.theguardian.com age: 0 days
qualifiers: 35.00 travel(|ing)

Video Friday: Humans Helping Robots
Fri, 23 Sep 2022 18:05:01 +0000


Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

IROS 2022: 23–27 October 2022, KYOTO, JAPAN
ANA Avatar XPRIZE Finals: 4–5 November 2022, LOS ANGELES
CoRL 2022: 14–18 December 2022, AUCKLAND, NEW ZEALAND

Enjoy today’s videos!


Until robots achieve 100 percent autonomy (HA), humans are going to need to step in from time to time, and Contoro is developing a system for intuitive, remote human intervention.

[ Contoro ]

Thanks, Youngmok!

A one year update of our ongoing project with Ontario Power Generation (OPG) and RMUS Canada to investigate the capabilities of Boston Dynamics’ Spot robot for autonomous inspection and first response in the power sector. Highlights of the first year of the project, featuring the work of Ph.D. student Christopher Baird, include autonomous elevator riding and autonomous door opening (including proxy card access doors) as part of Autowalks, as well as autonomous firefighting.

[ MARS Lab ]

Teams involved in DARPA’s Robotic Autonomy in Complex Environments with Resiliency (RACER) program have one experiment under their belts and will focus on even more difficult off-road landscapes at Camp Roberts, California, September 15–27. The program aims to give driverless combat vehicles off-road autonomy while traveling at speeds that keep pace with vehicles driven by people in realistic situations.

[ DARPA ]

Tool use has long been a hallmark of human intelligence, as well as a practical problem to solve for a vast array of robotic applications. But machines are still wonky at exerting just the right amount of force to control tools that aren’t rigidly attached to their hands. To manipulate said tools more robustly, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), in collaboration with the Toyota Research Institute (TRI), have designed a system that can grasp tools and apply the appropriate amount of force for a given task, like squeegeeing up liquid or writing out a word with a pen.

[ MIT ]

Cornell researchers installed electronic “brains” on solar-powered robots that are 100 to 250 micrometers in size, so the tiny bots can walk autonomously without being externally controlled.

[ Cornell ]

Researchers at the University of California, San Diego, have developed soft devices containing algae that glow in the dark when experiencing mechanical stress, such as being squished, stretched, twisted or bent. The devices do not require any electronics to light up, making them an ideal choice for building soft robots that explore the deep sea and other dark environments, researchers said.

[ UCSD ]

Thanks, Liezel!

Our robotaxi is built to withstand a range of temperatures to ensure that the vehicle, and most importantly, its riders are never too hot or too cold...no matter the weather. Learn more about our thermal testing in the latest episode of Putting Zoox to the Test.

[ Zoox ]

Thanks, Whitney!

Skydio drones will do an excellent job of keeping you in frame, whatever happens.

[ Skydio ]

As urbanization accelerates around the world, developing and using underground space is increasingly important for economic and social development. Zhejiang University’s Huzhou Research Institute convened a robot team to explore an unknown underground environment in Yellow Dragon Cave. DEEP Robotics joined this robot party to take on the underground challenges, teaming up with a drone team in an air-ground robot collaboration.

[ Deep Robotics ]

The title of this video is “Ion Propulsion Drone Proves Its Commercial Viability,” but it seems like quite a leap from a 4.5-minute flight to reaching the 15-minute flight with a significant payload that would be required for last-mile delivery.

[ Undefined Technologies ]

Welcome to this week’s edition of “How much stuff can you cram onto a Husky?”

[ Clearpath ]

In the Nanocopter AI challenge, the teams demonstrated the AI they developed for Bitcraze AB’s Crazyflie nanocopters to perform vision-based obstacle avoidance at increasing speeds. The drones flew around in our “Cyberzoo,” avoiding a range of obstacles, from walls to poles and artificial plants. The drones were primarily scored on the distance they covered in the limited time, but could gain extra points by also flying through gates.

[ IMAV ]

Watch this drone deliver six eggs to an empty field!

Sorry, I shouldn’t be so snarky, but I’m still not sold on the whole urban drone delivery of groceries thing.

[ Wing ]

Flexiv is pleased to announce the launch of its ROS 2 driver to bring a better robot development experience for customers.

[ Flexiv ]

Northrop Grumman has been pioneering new capabilities in the undersea domain for more than 50 years. Manta Ray, a new unmanned underwater vehicle, taking its name from the massive “winged” fish, will need to be able to operate on long-duration, long-range missions in ocean environments without need for on-site human logistics support—a unique but important mission needed to address the complex nature of undersea warfare.

[ Northrop Grumman ]

Some unique footage from drones that aren’t scared of getting a little wet.

[ Blastr ]

People tend to overtrust sophisticated computing devices, especially those powered by AI. As these systems become more fully interactive with humans during the performance of day-to-day activities, ethical considerations in deploying these systems must be more carefully investigated. In this talk, we will discuss various forms of human overtrust with respect to these intelligent machines and possible ways to mitigate the impact of bias in our interactions with them.

[ Columbia ]

The Jet Propulsion Laboratory’s success in landing the low-cost Mars Pathfinder mission in 1997 was viewed as proof that spacecraft could be built more often and for far less money—a radical cultural change NASA termed “Faster, Better, Cheaper.” The next challenge taken on by JPL was to fly two missions to Mars for the price of the single Pathfinder mission. Mars Climate Orbiter and the Mars Polar Lander both made it to the launchpad, on time and on budget, but were lost upon arrival at Mars, resulting in one of the most difficult periods in the history of JPL. “The Breaking Point” tells the story of the demise of these two missions and the abrupt end of NASA’s “Faster, Better, Cheaper” era.

[ JPL ]


Match ID: 2 Score: 35.00 source: spectrum.ieee.org age: 0 days
qualifiers: 35.00 travel(|ing)

Ghent is Belgium’s unsung capital of cool
Fri, 23 Sep 2022 10:00:54 EDT
Ghent, Belgium, has never gotten the love of Antwerp or Bruges. Here’s why the increasingly green destination should.
Match ID: 3 Score: 35.00 source: www.washingtonpost.com age: 1 day
qualifiers: 35.00 travel(|ing)

In Morocco, a new take on working vacations
Fri, 23 Sep 2022 08:00:17 EDT
I wanted to pitch in during the labor shortage. So, I rolled up my sleeves and served breakfast and checked in guests at a hostel in Morocco.
Match ID: 4 Score: 35.00 source: www.washingtonpost.com age: 1 day
qualifiers: 35.00 travel(|ing)

South Korea president criticised over gaffes at Queen’s funeral and UN
Fri, 23 Sep 2022 09:18:13 GMT

Yoon Suk-yeol accused of discourtesy in London and of swearing after chat to Joe Biden

South Korea’s president has been accused of causing a “diplomatic disaster” after his first major international trip, to the Queen’s funeral and the UN general assembly, was marred by alleged discourtesy and an expletive directed at members of the US congress.

Yoon Suk-yeol, a conservative who was already battling low approval ratings only months after taking office, drew criticism from across the South Korean political spectrum after he failed to attend the Queen’s lying in state despite traveling to London.

Continue reading...
Match ID: 5 Score: 35.00 source: www.theguardian.com age: 1 day
qualifiers: 35.00 travel(|ing)

Fact Check: Rep. Rashida Tlaib Said Progressives Must Oppose Israeli Apartheid
Thu, 22 Sep 2022 16:42:08 +0000

Recent claims that Tlaib insisted progressives must reject Israel’s right to exist have been examined and found to be misinformation.

The post Fact Check: Rep. Rashida Tlaib Said Progressives Must Oppose Israeli Apartheid appeared first on The Intercept.


Match ID: 6 Score: 35.00 source: theintercept.com age: 2 days
qualifiers: 35.00 travel(|ing)

How to plan — and survive — a trip with friends
Thu, 22 Sep 2022 12:00:42 EDT
Traveling with friends can be a fun experience that deepens bonds. Here's how to make a trip happen — and emerge with friendships intact.
Match ID: 7 Score: 35.00 source: www.washingtonpost.com age: 2 days
qualifiers: 35.00 travel(|ing)

Circle of Circuits
Wed, 21 Sep 2022 15:00:00 +0000


The Big Picture features technology through the lens of photographers.

Every month, IEEE Spectrum selects the most stunning technology images recently captured by photographers around the world. We choose images that reflect an important advance, or a trend, or that are just mesmerizing to look at. We feature all images on our site, and one also appears on our monthly print edition.

Enjoy the latest images, and if you have suggestions, leave a comment below.

RoboCup Class Picture


Humanoid robots and their human handlers gather in a circle for a group photo.

Have you ever been awed by the pageantry of the parade of nations in the opening ceremony of the Olympic Games? Then this photo, featuring more than 100 Nao programmable educational robots, two Pepper humanoid assistive robots, and their human handlers, should leave you similarly amazed. It was taken at the end of this year’s RoboCup 2022 in Bangkok. After two years during which the RoboCup was scuttled by the global pandemic, the organizers were able to bring together 13 robot teams from around the world (with three teams joining in remotely) to participate in the automaton games. The spirit of the gathering was captured in this image, which, according to RoboCup organizers, shows robots with a combined market value of roughly US $1 million.

Patrick Göttsch and Thomas Reinhardt


A satellite containing a cellular base station is shown in orbit.

Longest-Distance Calls

When you’re traveling to faraway destinations, it’s comforting to know that you can remain in contact with the folks back home, no matter how far you roam. Still, it’s surprisingly easy to end up somewhere that has poor cellular reception or none to speak of. That’s because only about 10 percent of the world’s surface is in cellular coverage zones. But in April 2022, a company called Lynk launched Lynk Tower 1, poised to be the world’s first commercial satellite cell tower, into space. The cell tower, pictured here, is said to be the first of four that Lynk plans to launch into orbit this year. Once they’re in place and contracts with terrestrial cellular service providers are set up, the 4 billion people who hardly ever have adequate cellular reception will finally be able to respond in the plural when asked “How many bars you got?”

Lynk Global


Steam rises from a set of specially arranged tubes that yield a cooling effect.

Self-Made Manufacturing

What’s more meta than using a 3D printer to make parts for a 3D printer? This device looks like a bunch of separate tubes packaged together. But it is actually a single unit that was built that way inside a 3D printer. It is a precision-engineered heat exchanger—optimized to improve the cooling of shielding gas that keeps impurities from fouling the additive manufacturing process that occurs inside an industrial 3D printer. No paper jams here.

Hyperganic


White plastic rectangular patch with four buttons is stuck on the back of a person’s hand. Beads of water on the hand run off the waterproof patch.

None the Worse for Wearable

How are we to benefit from the physical and cognitive enhancements that electronic wearables could someday provide if everyday aspects of human life such as breaking a sweat are hazardous to these devices? Not to worry. In a recent paper, a research team at the University of California, Los Angeles, reported that it has the problem licked. The researchers developed a human-machine interface that is impervious to moisture. And, as if being waterproof weren’t enough, the four-button device has been engineered to generate enough electric current to power its own operation when any of the buttons is pressed. So, it can go just about anywhere we go, with no concerns about spills, splashes, sweat, or spent batteries.

JUN CHEN RESEARCH GROUP/UCLA


Match ID: 8 Score: 30.00 source: spectrum.ieee.org age: 3 days
qualifiers: 30.00 travel(|ing)

No Way Home, Episode Four: Getting Out Alive
Wed, 21 Sep 2022 10:00:37 +0000

After a year in Taliban-controlled Afghanistan, one family gets an unexpected chance to leave.

The post No Way Home, Episode Four: Getting Out Alive appeared first on The Intercept.


Match ID: 9 Score: 30.00 source: theintercept.com age: 3 days
qualifiers: 30.00 travel(|ing)

Israeli Forces Deliberately Killed Palestinian American Journalist, Report Shows
Tue, 20 Sep 2022 17:11:53 +0000

A new forensic analysis proves that an Israeli sniper could see that Shireen Abu Akleh was a journalist before firing the bullet that killed her.

The post Israeli Forces Deliberately Killed Palestinian American Journalist, Report Shows appeared first on The Intercept.


Match ID: 10 Score: 25.00 source: theintercept.com age: 4 days
qualifiers: 25.00 travel(|ing)

Large-Scale Collection of Cell Phone Data at US Borders
2022-09-19T11:07:38Z

The Washington Post is reporting that the US Customs and Border Protection agency is seizing and copying cell phone, tablet, and computer data from “as many as” 10,000 phones per year, including an unspecified number of American citizens. This is done without a warrant, because “…courts have long granted an exception to border authorities, allowing them to search people’s devices without a warrant or suspicion of a crime.”

CBP’s inspection of people’s phones, laptops, tablets and other electronic devices as they enter the country has long been a controversial practice that the agency has defended as a low-impact way to pursue possible security threats and determine an individual’s “intentions upon entry” into the U.S. But the revelation that thousands of agents have access to a searchable database without public oversight is a new development in what privacy advocates and some lawmakers warn could be an infringement of Americans’ Fourth Amendment rights against unreasonable searches and seizures...


Match ID: 11 Score: 20.00 source: www.schneier.com age: 5 days
qualifiers: 20.00 travel(|ing)

Take a Trip Through Switzerland’s Museum of Consumer Electronics
Fri, 16 Sep 2022 18:00:00 +0000


For more than a decade Museum ENTER, in Solothurn, Switzerland, has been a place where history buffs can explore and learn about the development and growth of computer and consumer electronics in Switzerland and the rest of the world. On display are computers, calculators, floppy disks, phonographs, radios, video game consoles, and related objects.

Thanks to a new four-year partnership between the museum and the IEEE Switzerland Section, IEEE members may visit the facility for free. They also can donate their time to help create exhibits; translate pamphlets, display cards, and other written media; and present science, technology, engineering, and math workshops.

A room full of historical radios. The technology on display includes televisions and radios from the 1950s. ENTER Museum

Collections of calculators, radios, telephones, and televisions

ENTER started as the private collection of Swiss entrepreneur Felix Kunz, who had been amassing computers and other electronics since the mid-1970s. Kunz and Peter Regenass—a collector of calculators—opened the museum in 2011 near the Solothurn train station.

The museum’s collection focuses on the history of technology made in Switzerland by companies including Bolex, Crypto AG, and Gretag. The technology on display includes early telegraphs, telephones, televisions, and radios.

There are 300 mechanical calculators from Regenass’s collection. One of the mechanical calculators, Curta, looks like a pepper mill and has more than 700 parts.

The museum also has several Volksempfängers, the early radio models used by the Nazis to spread propaganda.

Visitors can check out the collection of working Apple computers, which the museum claims is the largest in Europe.

Free admission, discounts, and STEM education courses

The IEEE Switzerland Section began its partnership with the museum last year, when the student branch at the IEEE EPFL hosted a presentation there, says IEEE Senior Member Mathieu Coustans, the Switzerland Section’s treasurer.

In May, the section and the museum organized a workshop celebrating 100 years of radio broadcasting in Switzerland. IEEE members presented on the topic in French, Coustans says, and then translated the presentations to English.

Based on the success of both events, he says, the section and the museum began to discuss how else they could collaborate.

The two organizations discovered they have “many of the same goals,” says IEEE Member Violetta Vitacca, chief executive of the museum. They both aim to inspire the next generation of engineers, promote the history of technology, and bring together engineers from academia and industry to collaborate. The section and museum decided to create a long-term partnership to help each other succeed.

In addition to the free visits, IEEE members receive a 10 percent discount on services offered by the museum, including digitizing books and other materials and repairing broken equipment such as radios and vintage record players. Members can donate historical artifacts too. In addition, IEEE groups are welcome to host conferences and section meetings at the facility.

The IEEE Switzerland Section as well as members of student branches and the local IEEE Life Members Affinity Group have agreed to speak at events held at the museum and teach STEM classes there.

“The museum is a space where both professional engineers and young people can network and learn from each other,” Vitacca says. “I think this partnership is a win-win for both IEEE and the museum.”

She says she hopes that “collaborating with IEEE will help Museum ENTER gain an international reputation.”

The perks of the collaboration will become “especially attractive with the opening of the brand-new Museum ENTER building” next year, says IEEE Senior Member Hugo Wyss, chair of the Switzerland Section, who led the partnership effort.

Exhibits on gaming, inventors, and startups

The museum is set to move in May to a larger building in the village of Derendingen. When it reopens there in November, these are some new additions visitors can look forward to:

  • Audio guides, display cards, and pamphlets in German, English, and French.
  • “The Academy,” which aims to inspire the next generation of engineers, offering workshops, lectures, and other events, as well as access to a technical library.
  • A data digitization laboratory where collectors and electronics enthusiasts can convert vintage media carriers, records, and film.
  • A public-gathering piazza with an attached café and meeting rooms.

Children watch a demonstration at the museum. The museum offers STEM workshops. ENTER Museum

In addition, these eight permanent exhibits will be available, the museum says:

  • Game Area. A display featuring innovations that have driven the rise of gaming and high-performance computing.
  • Hall of Brands. A showcase of technologies from well-known companies.
  • Now. Current technology highlighted in the news.
  • Show of Pioneers. A look at the inventors of popular consumer and computer electronics.
  • Switzerland Connected. A showcase for the country’s former and current accelerators, startups, and schools.
  • Time Travel. A retrospective look at 150 years of technology.
  • Typology of Technology. Applications such as optical and magnetic recording used for music and film.

The museum also plans to curate special exhibitions.

“We are going from being simply a museum with an extensive collection to being a center for networking, education, and innovation,” Vitacca says. “That’s why it’s important for the museum to collaborate with IEEE. Our offerings are not only unique in Switzerland but also across Europe. IEEE is a great partner for us to help get the word out about what we do.”


Match ID: 12 Score: 5.00 source: spectrum.ieee.org age: 8 days
qualifiers: 5.00 travel(|ing)

Alex on the rocks
Mon, 12 Sep 2022 13:47:00 +0200
Image:

ESA astronaut Alexander Gerst and NASA astronaut Stephanie Wilson are getting world-class geology training this week during the fifth edition of ESA’s Pangaea course.

A balanced mix of theory and field trips, the course will take the pair all over Europe to hone their geology skills. The training began last week in the Italian Dolomites with lessons on fundamental geology knowledge and skills, martian geology and asteroids at Bletterbach Canyon.

The rock samples from the canyon Alexander is holding in this image are a combination of gypsum (white hue) in siltstone-sandstone (reddish hue), and are analogous to rocks found on Mars.

This week, Alexander and Stephanie will follow the footsteps of Apollo astronauts to study the Ries crater in Germany, one of the best-preserved impact craters on Earth, where American crews trained before their flights to the Moon.

The course concludes the year with a trip to the volcanic landscapes of Lanzarote, Spain in November, to learn about the geological interactions between volcanic activity and water – two key factors in the search for life.

The final part of the course has the astronauts travel to Lofoten, Norway, to focus on rocks similar to the lunar highlands. These will be important locations to explore during the future Artemis missions, as they may hold key information for unravelling the history of the Moon and our Solar System.

The different field locations visited during Pangaea are used to train Alexander and Stephanie on how to read a landscape, collect scientifically relevant samples and effectively communicate their geological observations with teams back on Earth.

Alexander is a geophysicist, volcanologist and more recently International Space Station commander in 2018, and has seen 5700 sunrises and sunsets in space. Pangaea is challenging this seasoned space explorer to become a field scientist in preparation for future deep space missions, where the astronauts will be the eyes and ears of the scientific community on Earth.

Follow Alexander on Twitter for his takes on getting back in the classroom for Pangaea.


Match ID: 13 Score: 5.00 source: www.esa.int age: 12 days
qualifiers: 5.00 travel(|ing)

Deep Learning Could Bring the Concert Experience Home
Sat, 10 Sep 2022 15:00:00 +0000


Now that recorded sound has become ubiquitous, we hardly think about it. From our smartphones, smart speakers, TVs, radios, disc players, and car sound systems, it’s an enduring and enjoyable presence in our lives. In 2017, a survey by the polling firm Nielsen suggested that some 90 percent of the U.S. population listens to music regularly and that, on average, they do so 32 hours per week.

Behind this free-flowing pleasure are enormous industries applying technology to the long-standing goal of reproducing sound with the greatest possible realism. From Edison’s phonograph and the horn speakers of the 1880s, successive generations of engineers in pursuit of this ideal invented and exploited countless technologies: triode vacuum tubes, dynamic loudspeakers, magnetic phonograph cartridges, solid-state amplifier circuits in scores of different topologies, electrostatic speakers, optical discs, stereo, and surround sound. And over the past five decades, digital technologies, like audio compression and streaming, have transformed the music industry.

And yet even now, after 150 years of development, the sound we hear from even a high-end audio system falls far short of what we hear when we are physically present at a live music performance. At such an event, we are in a natural sound field and can readily perceive that the sounds of different instruments come from different locations, even when the sound field is criss-crossed with mixed sound from multiple instruments. There’s a reason why people pay considerable sums to hear live music: it is more enjoyable and more exciting, and it can generate a bigger emotional impact.

To hear the author's 3D Soundstage audio for yourself, grab your headphones and head over to 3dsoundstage.com/ieee.

Today, researchers, companies, and entrepreneurs, including ourselves, are closing in at last on recorded audio that truly re-creates a natural sound field. The group includes big companies, such as Apple and Sony, as well as smaller firms, such as Creative. Netflix recently disclosed a partnership with Sennheiser under which the network has begun using a new system, Ambeo 2-Channel Spatial Audio, to heighten the sonic realism of such TV shows as “Stranger Things” and “The Witcher.”

There are now at least half a dozen different approaches to producing highly realistic audio. We use the term “soundstage” to distinguish our work from other audio formats, such as the ones referred to as spatial audio or immersive audio. These can represent sound with more spatial effect than ordinary stereo, but they do not typically include the detailed sound-source location cues that are needed to reproduce a truly convincing sound field.

We believe that soundstage is the future of music recording and reproduction. But before such a sweeping revolution can occur, it will be necessary to overcome an enormous obstacle: that of conveniently and inexpensively converting the countless hours of existing recordings, regardless of whether they’re mono, stereo, or multichannel surround sound (5.1, 7.1, and so on). No one knows exactly how many songs have been recorded, but according to the entertainment-metadata concern Gracenote, more than 200 million recorded songs are available now on planet Earth. Given that the average duration of a song is about 3 minutes, this is the equivalent of about 1,100 years of music.
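A quick back-of-envelope check of that figure, using the 200 million songs and 3-minute average cited above:

```python
# Catalog-duration estimate from the figures above
songs = 200_000_000              # recorded songs, per Gracenote
minutes_per_song = 3             # average song duration
total_minutes = songs * minutes_per_song
years = total_minutes / (60 * 24 * 365.25)   # minutes in an average year
print(f"{years:,.0f} years of continuous music")
```

That works out to roughly 1,100 years, consistent with the estimate above.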

Measuring a Head-Related Transfer Function


To provide a high degree of spatial realism for a listener, you need to precisely map the details of how that listener’s unique head shape, ears, and nasal cavity affect how he or she hears sound. This is done by determining the listener’s head-related transfer function, which is accomplished by playing sounds from a variety of angles and recording how the user’s head affects the sounds at each position.




That is a lot of music. Any attempt to popularize a new audio format, no matter how promising, is doomed to fail unless it includes technology that makes it possible for us to listen to all this existing audio with the same ease and convenience with which we now enjoy stereo music—in our homes, at the beach, on a train, or in a car.

We have developed such a technology. Our system, which we call 3D Soundstage, permits music playback in soundstage on smartphones, ordinary or smart speakers, headphones, earphones, laptops, TVs, soundbars, and in vehicles. Not only can it convert mono and stereo recordings to soundstage, it also allows a listener with no special training to reconfigure a sound field according to their own preference, using a graphical user interface. For example, a listener can assign the locations of each instrument and vocal sound source and adjust the volume of each—changing the relative volume of, say, vocals in comparison with the instrumental accompaniment. The system does this by leveraging artificial intelligence (AI), virtual reality, and digital signal processing (more on that shortly).

To re-create convincingly the sound coming from, say, a string quartet in two small speakers, such as the ones available in a pair of headphones, requires a great deal of technical finesse. To understand how this is done, let’s start with the way we perceive sound.

When sound travels to your ears, unique characteristics of your head—its physical shape, the shape of your outer and inner ears, even the shape of your nasal cavities—change the audio spectrum of the original sound. Also, there is a very slight difference in the arrival time from a sound source to your two ears. From this spectral change and the time difference, your brain perceives the location of the sound source. The spectral changes and time difference can be modeled mathematically as head-related transfer functions (HRTFs). For each point in three-dimensional space around your head, there is a pair of HRTFs, one for your left ear and the other for the right.

So, given a piece of audio, we can process that audio using a pair of HRTFs, one for the right ear, and one for the left. To re-create the original experience, we would need to take into account the location of the sound sources relative to the microphones that recorded them. If we then played that processed audio back, for example through a pair of headphones, the listener would hear the audio with the original cues, and perceive that the sound is coming from the directions from which it was originally recorded.
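The filtering step described above can be sketched as a pair of convolutions, one per ear. The impulse responses below are toy placeholders standing in for measured head-related impulse responses (the time-domain form of HRTFs), not real data:

```python
import numpy as np

# Minimal sketch: place a mono source at one direction by filtering it
# with a left-ear and a right-ear head-related impulse response (HRIR).
# These HRIRs are toy placeholders, not measured data.
rng = np.random.default_rng(0)
mono_source = rng.standard_normal(48_000)        # 1 s of audio at 48 kHz
hrir_left = np.array([1.0, 0.5, 0.25])           # toy left-ear impulse response
hrir_right = np.array([0.6, 0.3, 0.15, 0.05])    # toy right-ear response (attenuated, delayed)

left = np.convolve(mono_source, hrir_left)       # what the left ear would receive
right = np.convolve(mono_source, hrir_right)     # what the right ear would receive
binaural = np.stack([left[:len(mono_source)], right[:len(mono_source)]])
```

Played back over headphones, the spectral and timing differences between the two channels are what let the brain place the source in space.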

If we don’t have the original location information, we can simply assign locations for the individual sound sources and get essentially the same experience. The listener is unlikely to notice minor shifts in performer placement—indeed, they might prefer their own configuration.


Even now, after 150 years of development, the sound we hear from even a high-end audio system falls far short of what we hear when we are physically present at a live music performance.


There are many commercial apps that use HRTFs to create spatial sound for listeners using headphones and earphones. One example is Apple’s Spatialize Stereo. This technology applies HRTFs to playback audio so you can perceive a spatial sound effect—a deeper sound field that is more realistic than ordinary stereo. Apple also offers a head-tracker version that uses sensors on the iPhone and AirPods to track the relative direction between your head, as indicated by the AirPods in your ears, and your iPhone. It then applies the HRTFs associated with the direction of your iPhone to generate spatial sounds, so you perceive that the sound is coming from your iPhone. This isn’t what we would call soundstage audio, because instrument sounds are still mixed together. You can’t perceive that, for example, the violin player is to the left of the viola player.

Apple does, however, have a product that attempts to provide soundstage audio: Apple Spatial Audio. It is a significant improvement over ordinary stereo, but it still has a couple of difficulties, in our view. One, it incorporates Dolby Atmos, a surround-sound technology developed by Dolby Laboratories. Spatial Audio applies a set of HRTFs to create spatial audio for headphones and earphones. However, the use of Dolby Atmos means that all existing stereophonic music would have to be remastered for this technology. Remastering the millions of songs already recorded in mono and stereo would be basically impossible. Another problem with Spatial Audio is that it can only support headphones or earphones, not speakers, so it has no benefit for people who tend to listen to music in their homes and cars.

So how does our system achieve realistic soundstage audio? We start by using machine-learning software to separate the audio into multiple isolated tracks, each representing one instrument or singer or one group of instruments or singers. This separation process is called upmixing. A producer or even a listener with no special training can then recombine the multiple tracks to re-create and personalize a desired sound field.

Consider a song featuring a quartet consisting of guitar, bass, drums, and vocals. The listener can decide where to “locate” the performers and can adjust the volume of each, according to his or her personal preference. Using a touch screen, the listener can virtually arrange the sound-source locations and the listener’s position in the sound field, to achieve a pleasing configuration. The graphical user interface displays a shape representing the stage, upon which are overlaid icons indicating the sound sources—vocals, drums, bass, guitars, and so on. There is a head icon at the center, indicating the listener’s position. The listener can touch and drag the head icon around to change the sound field according to their own preference.

Moving the head icon closer to the drums makes the sound of the drums more prominent. If the listener moves the head icon onto an icon representing an instrument or a singer, the listener will hear that performer as a solo. The point is that by allowing the listener to reconfigure the sound field, 3D Soundstage adds new dimensions (if you’ll pardon the pun) to the enjoyment of music.
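One way the head-icon interaction could map to per-source volumes is an inverse-distance gain, sketched below; the stage coordinates and the gain law are illustrative assumptions, not the product's actual algorithm:

```python
import math

# Sketch of the stage interaction described above: moving the listener's
# head icon changes each source's gain. Coordinates and the
# inverse-distance law are illustrative assumptions.
sources = {"vocals": (0.0, 2.0), "drums": (-1.5, 3.0), "bass": (1.5, 3.0)}

def source_gains(listener_xy, sources, floor=0.2):
    gains = {}
    for name, (x, y) in sources.items():
        d = math.dist(listener_xy, (x, y))
        gains[name] = 1.0 / max(d, floor)   # closer source -> louder
    return gains

# Dragging the head icon toward the drums makes them more prominent:
near_drums = source_gains((-1.4, 2.9), sources)
center = source_gains((0.0, 0.0), sources)
```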

The converted soundstage audio can be in two channels, if it is meant to be heard through headphones or an ordinary left- and right-channel system. Or it can be multichannel, if it is destined for playback on a multiple-speaker system. In this latter case, a soundstage audio field can be created by two, four, or more speakers. The number of distinct sound sources in the re-created sound field can even be greater than the number of speakers.

An Audio Taxonomy



For a listener seeking a high degree of spatial realism, a variety of audio formats and systems are now available for enjoyment through speakers or headphones. On the low end, ordinary mono and stereo recordings provide a minimal spatial-perceptual experience. In the middle range, multichannel recordings, such as 5.1 and 7.1 surround sound, offer somewhat higher levels of spatial realism. At the highest levels are audio systems that start with the individual, separated instrumental tracks of a recording and recombine them, using audio techniques and tools such as head-related transfer functions, to provide a highly realistic spatial experience.



This multichannel approach should not be confused with ordinary 5.1 and 7.1 surround sound. These typically have five or seven separate channels and a speaker for each, plus a subwoofer (the “.1”). The multiple loudspeakers create a sound field that is more immersive than a standard two-speaker stereo setup, but they still fall short of the realism possible with a true soundstage recording. When played through such a multichannel setup, our 3D Soundstage recordings bypass the 5.1, 7.1, or any other special audio formats, including multitrack audio-compression standards.

A word about these standards. In order to better handle the data for improved surround-sound and immersive-audio applications, new standards have been developed recently. These include the MPEG-H 3D audio standard for immersive spatial audio with Spatial Audio Object Coding (SAOC). These new standards succeed various multichannel audio formats and their corresponding coding algorithms, such as Dolby Digital AC-3 and DTS, which were developed decades ago.

While developing the new standards, the experts had to take into account many different requirements and desired features. People want to interact with the music, for example by altering the relative volumes of different instrument groups. They want to stream different kinds of multimedia, over different kinds of networks, and through different speaker configurations. SAOC was designed with these features in mind, allowing audio files to be efficiently stored and transported, while preserving the possibility for a listener to adjust the mix based on their personal taste.

To do so, however, it depends on a variety of standardized coding techniques. To create the files, SAOC uses an encoder. The inputs to the encoder are data files containing sound tracks; each track is a file representing one or more instruments. The encoder essentially compresses the data files, using standardized techniques. During playback, a decoder in your audio system decodes the files, which are then converted back to the multichannel analog sound signals by digital-to-analog converters.

Our 3D Soundstage technology bypasses this. We use mono or stereo or multichannel audio data files as input. We separate those files or data streams into multiple tracks of isolated sound sources, and then convert those tracks to two-channel or multichannel output, based on the listener’s preferred configurations, to drive headphones or multiple loudspeakers. We use AI technology to avoid multitrack rerecording, encoding, and decoding.

In fact, one of the biggest technical challenges we faced in creating the 3D Soundstage system was writing the machine-learning software that separates (or upmixes) a conventional mono, stereo, or multichannel recording into multiple isolated tracks in real time. The software runs on a neural network. We developed this approach for music separation in 2012 and described it in patents that were awarded in 2015 and 2022 (the U.S. patent numbers are 9,131,305 B2 and 11,240,621 B2).


The listener can decide where to “locate” the performers and can adjust the volume of each, according to his or her personal preference.


A typical session has two components: training and upmixing. In the training session, a large collection of mixed songs, along with their isolated instrument and vocal tracks, is used as the input and target output, respectively, for the neural network. The training uses machine learning to optimize the neural-network parameters so that the output of the neural network—the collection of individual tracks of isolated instrument and vocal data—matches the target output.

A neural network is very loosely modeled on the brain. It has an input layer of nodes, which represent biological neurons, and then many intermediate layers, called “hidden layers.” Finally, after the hidden layers there is an output layer, where the final results emerge. In our system, the data fed to the input nodes is the data of a mixed audio track. As this data proceeds through layers of hidden nodes, each node performs computations that produce a sum of weighted values. Then a nonlinear mathematical operation is performed on this sum. This calculation determines whether and how the audio data from that node is passed on to the nodes in the next layer.

There are dozens of these layers. As the audio data goes from layer to layer, the individual instruments are gradually separated from one another. At the end, in the output layer, each separated audio track is output on a node in the output layer.

That’s the idea, anyway. While the neural network is being trained, the output may be off the mark. It might not be an isolated instrumental track—it might contain audio elements of two instruments, for example. In that case, the individual weights in the weighting scheme used to determine how the data passes from hidden node to hidden node are tweaked and the training is run again. This iterative training and tweaking goes on until the output matches, more or less perfectly, the target output.
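The iterative train-and-tweak loop described above can be sketched with a toy two-layer network; the shapes, data, and plain gradient-descent update are illustrative, not the authors' actual model:

```python
import numpy as np

# Toy sketch of the training loop described above: a two-layer network
# maps "mixed" input frames toward "separated" target frames, and the
# weights are tweaked iteratively until the output approaches the target.
rng = np.random.default_rng(1)
mixed = rng.standard_normal((64, 8))             # 64 training frames, 8 features each
target = mixed @ rng.standard_normal((8, 2))     # stand-in for isolated-track targets

W1 = rng.standard_normal((8, 16)) * 0.1          # input -> hidden weights
W2 = rng.standard_normal((16, 2)) * 0.1          # hidden -> output weights

def mse(W1, W2):
    h = np.maximum(mixed @ W1, 0.0)              # weighted sums + ReLU nonlinearity
    return float(np.mean((h @ W2 - target) ** 2))

loss_before = mse(W1, W2)
lr = 0.01
for _ in range(500):
    h = np.maximum(mixed @ W1, 0.0)
    err = h @ W2 - target                        # how far the output is off the mark
    # Tweak the weights so the output moves toward the target
    W2 -= lr * h.T @ err / len(mixed)
    W1 -= lr * mixed.T @ ((err @ W2.T) * (h > 0)) / len(mixed)
loss_after = mse(W1, W2)
```

Each pass nudges the weights so the mismatch with the target tracks shrinks, the same tweak-and-retrain cycle the article describes at much larger scale.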

As with any training data set for machine learning, the greater the number of available training samples, the more effective the training will ultimately be. In our case, we needed tens of thousands of songs and their separated instrumental tracks for training; the total training data thus amounted to thousands of hours of music.

After the neural network is trained, the system takes a song with mixed sounds as input and, by running it through the trained network, outputs the multiple separated tracks.

Unmixing Audio With a Neural Network


A diagram depicts a neural network being used to separate a piece of audio into its component tracks.

After separating a recording into its component tracks, the next step is to remix them into a soundstage recording. This is accomplished by a soundstage signal processor, which performs a complex computational function to generate the output signals that drive the speakers and produce the soundstage audio. The inputs to the processor include the isolated tracks, the physical locations of the speakers, and the desired locations of the listener and sound sources in the re-created sound field. The outputs of the soundstage processor are multitrack signals, one for each channel, to drive the multiple speakers.

The sound field can be in a physical space, if it is generated by speakers, or in a virtual space, if it is generated by headphones or earphones. The function performed within the soundstage processor is based on computational acoustics and psychoacoustics, and it takes into account sound-wave propagation and interference in the desired sound field and the HRTFs for the listener and the desired sound field.

For example, if the listener is going to use earphones, the processor selects a set of HRTFs based on the configuration of desired sound-source locations, then uses the selected HRTFs to filter the isolated sound-source tracks. Finally, the soundstage processor combines all the HRTF outputs to generate the left and right tracks for earphones. If the music is going to be played back on speakers, at least two are needed, but the more speakers, the better the sound field. The number of sound sources in the re-created sound field can be more or less than the number of speakers.
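The combining step for earphones might look like the following sketch: each isolated track is filtered with the impulse-response pair for its assigned direction, and the per-ear results are summed into two channels. The direction-to-response table here is a toy placeholder:

```python
import numpy as np

# Sketch of the earphone rendering step described above: each isolated
# track is filtered with the HRIR pair for its assigned direction, and
# all left-ear (and right-ear) outputs are summed into two channels.
# The per-direction HRIRs are toy placeholders, not measured data.
rng = np.random.default_rng(2)
n = 1000
tracks = {"violin": rng.standard_normal(n), "viola": rng.standard_normal(n)}
hrirs = {  # direction -> (left-ear, right-ear) impulse responses
    "left_of_stage": (np.array([1.0, 0.4]), np.array([0.5, 0.2])),
    "right_of_stage": (np.array([0.5, 0.2]), np.array([1.0, 0.4])),
}
placement = {"violin": "left_of_stage", "viola": "right_of_stage"}

left = np.zeros(n)
right = np.zeros(n)
for name, signal in tracks.items():
    hl, hr = hrirs[placement[name]]
    left += np.convolve(signal, hl)[:n]    # this source's contribution, left ear
    right += np.convolve(signal, hr)[:n]   # this source's contribution, right ear
stereo_out = np.stack([left, right])
```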

We released our first soundstage app, for the iPhone, in 2020. It lets listeners configure, listen to, and save soundstage music in real time—the processing causes no discernible time delay. The app, called 3D Musica, converts stereo music from a listener’s personal music library, the cloud, or even streaming music to soundstage in real time. (For karaoke, the app can remove vocals, or output any isolated instrument.)

Earlier this year, we opened a Web portal, 3dsoundstage.com, that provides all the features of the 3D Musica app in the cloud plus an application programming interface (API) making the features available to streaming music providers and even to users of any popular Web browser. Anyone can now listen to music in soundstage audio on essentially any device.

When sound travels to your ears, unique characteristics of your head—its physical shape, the shape of your outer and inner ears, even the shape of your nasal cavities—change the audio spectrum of the original sound.

We also developed separate versions of the 3D Soundstage software for vehicles and home audio systems and devices to re-create a 3D sound field using two, four, or more speakers. Beyond music playback, we have high hopes for this technology in videoconferencing. Many of us have had the fatiguing experience of attending videoconferences in which we had trouble hearing other participants clearly or being confused about who was speaking. With soundstage, the audio can be configured so that each person is heard coming from a distinct location in a virtual room. Or the “location” can simply be assigned depending on the person’s position in the grid typical of Zoom and other videoconferencing applications. For some, at least, videoconferencing will be less fatiguing and speech will be more intelligible.

Just as audio moved from mono to stereo, and from stereo to surround and spatial audio, it is now starting to move to soundstage. In those earlier eras, audiophiles evaluated a sound system by its fidelity, based on such parameters as bandwidth, harmonic distortion, data resolution, response time, lossless or lossy data compression, and other signal-related factors. Now, soundstage can be added as another dimension to sound fidelity—and, we dare say, the most fundamental one. To human ears, the impact of soundstage, with its spatial cues and gripping immediacy, is much more significant than incremental improvements in fidelity. This extraordinary feature offers capabilities previously beyond the experience of even the most deep-pocketed audiophiles.

Technology has fueled previous revolutions in the audio industry, and it is now launching another one. Artificial intelligence, virtual reality, and digital signal processing are tapping into psychoacoustics to give audio enthusiasts capabilities they’ve never had. At the same time, these technologies are giving recording companies and artists new tools that will breathe new life into old recordings and open up new avenues for creativity. At last, the century-old goal of convincingly re-creating the sounds of the concert hall has been achieved.


Match ID: 14 Score: 5.00 source: spectrum.ieee.org age: 14 days
qualifiers: 5.00 travel(|ing)

Sunrise for the Moon
Fri, 26 Aug 2022 13:24:00 +0200
Image:

The Orion spacecraft with integrated European Service Module sit atop the Space Launch System, imaged at sunrise at historic Launchpad 39B at Kennedy Space Center in Florida, USA.

The Flight Readiness Review has deemed the trio GO for launch, marking the dawn of a new era in space exploration.

The first in a series of missions that will return humans to the Moon, including taking the first European, Artemis I is scheduled for launch no earlier than Monday 29 August, at 14:33 CEST.

This mission will put NASA’s Orion spacecraft and ESA’s European Service Module to the test during a journey beyond the Moon and back. No crew will be on board Orion this time, and the spacecraft will be controlled by teams on Earth.

The crew module, however, won’t be empty. Two mannequins, named Helga and Zohar, will occupy the passenger seats. Their female-shaped plastic bodies are filled with over 5600 sensors each to measure the radiation load during their trip around the Moon. The specially trained woolly astronaut, Shaun the Sheep, has also been assigned a seat.

The spacecraft will enter lunar orbit using the Moon’s gravity to gain speed and propel itself almost half a million km from Earth – farther than any human-rated spacecraft has ever travelled.

The second Artemis mission will see four astronauts travel around the Moon on a flyby voyage around our natural satellite.

Mission duration depends on the launch date and even the time of day. It will last between 20 and 40 days, depending on how many orbits of the Moon mission designers decide to make.

This flexibility in mission length is necessary to allow the mission to end as intended with a splashdown during daylight hours in the Pacific Ocean, off the coast of California, USA.

Two more dates are available if a launch on 29 August is not possible. The Artemis Moon mission can also be launched on 2 September and 5 September. Check all the possible launch options on ESA’s Orion blog.

Orion is the only spacecraft capable of human spaceflight outside Earth orbit and high-speed reentry from the vicinity of the Moon. More than just a crew module, Orion includes the European Service Module (ESM), the powerhouse that fuels and propels Orion.

ESM provides for all astronauts’ basic needs, such as water, oxygen, nitrogen, temperature control, power and propulsion. Much like a train engine pulls passenger carriages and supplies power, the European Service Module will take the Orion capsule to its destination and back.

Watch launch coverage on ESA Web TV starting at 12:30 CEST here. Follow @esaspaceflight for updates and live Twitter coverage.


Match ID: 15 Score: 5.00 source: www.esa.int age: 29 days
qualifiers: 5.00 travel(|ing)

Solar-to-Jet-Fuel System Readies for Takeoff
Wed, 03 Aug 2022 17:00:00 +0000


As climate change edges from crisis to emergency, the aviation sector looks set to miss its 2050 goal of net-zero emissions. In the five years preceding the pandemic, the top four U.S. airlines—American, Delta, Southwest, and United—saw a 15 percent increase in the use of jet fuel. Despite continual improvements in engine efficiencies, that number is projected to keep rising.

A glimmer of hope, however, comes from solar fuels. For the first time, scientists and engineers at the Swiss Federal Institute of Technology (ETH) in Zurich have reported a successful demonstration of an integrated fuel-production plant for solar kerosene. Using concentrated solar energy, they were able to produce kerosene from water vapor and carbon dioxide directly from air. Fuel thus produced is a drop-in alternative to fossil-derived fuels and can be used with existing storage and distribution infrastructures, and engines.

Fuels derived from synthesis gas (or syngas)—an intermediate product that is a specific mixture of carbon monoxide and hydrogen—are a known alternative to conventional, fossil-derived fuels. Liquid fuels are produced from syngas by Fischer-Tropsch (FT) synthesis, in which catalytic reactions convert the carbon monoxide and hydrogen into hydrocarbons. The team of researchers at ETH found that a solar-driven thermochemical method to split water and carbon dioxide using a metal oxide redox cycle can produce renewable syngas. They demonstrated the process in a rooftop solar refinery at the ETH Machine Laboratory in 2019.

Close-up of a spongy looking material Reticulated porous structure made of ceria used in the solar reactor to thermochemically split CO2 and H2O and produce syngas, a specific mixture of H2 and CO.ETH Zurich

The current pilot-scale solar tower plant was set up at the IMDEA Energy Institute in Spain. It scales up the solar reactor of the 2019 experiment by a factor of 10, says Aldo Steinfeld, an engineering professor at ETH who led the study. The fuel plant brings together three subsystems—the solar tower concentrating facility, solar reactor, and gas-to-liquid unit.

First, a heliostat field made of mirrors that rotate to follow the sun concentrates solar irradiation onto a reactor mounted on top of the tower. The reactor is a cavity receiver lined with reticulated porous ceramic structures made of ceria (or cerium(IV) oxide). Within the reactor, the concentrated sunlight creates a high-temperature environment of about 1,500 °C, which is hot enough to split captured carbon dioxide and water from the atmosphere to produce syngas. Finally, the syngas is processed to kerosene in the gas-to-liquid unit. A centralized control room operates the whole system.

Fuel produced using this method closes the fuel carbon cycle as it only produces as much carbon dioxide as has gone into its manufacture. “The present pilot fuel plant is still a demonstration facility for research purposes,” says Steinfeld, “but it is a fully integrated plant and uses a solar-tower configuration at a scale that is relevant for industrial implementation.”

“The solar reactor produced syngas with selectivity, purity, and quality suitable for FT synthesis,” the authors noted in their paper. They also reported good material stability for multiple consecutive cycles. They observed a value of 4.1 percent solar-to-syngas energy efficiency, which Steinfeld says is a record value for thermochemical fuel production, even though better efficiencies are required to make the technology economically competitive.

Schematic of the solar tower fuel plant.  A heliostat field concentrates solar radiation onto a solar reactor mounted on top of the solar tower. The solar reactor cosplits water and carbon dioxide and produces a mixture of molecular hydrogen and carbon monoxide, which in turn is processed to drop-in fuels such as kerosene.ETH Zurich

“The measured value of energy conversion efficiency was obtained without any implementation of heat recovery,” he says. The heat rejected during the redox cycle of the reactor accounted for more than 50 percent of the solar-energy input. “This fraction can be partially recovered via thermocline heat storage. Thermodynamic analyses indicate that sensible heat recovery could potentially boost the energy efficiency to values exceeding 20 percent.”

To do so, more work is needed to optimize the ceramic structures lining the reactor, something the ETH team is actively working on, by looking at 3D-printed structures for improved volumetric radiative absorption. “In addition, alternative material compositions, that is, perovskites or aluminates, may yield improved redox capacity, and consequently higher specific fuel output per mass of redox material,” Steinfeld adds.

The next challenge for the researchers, he says, is the scale-up of their technology for higher solar-radiative power inputs, possibly using an array of solar cavity-receiver modules on top of the solar tower.

To bring solar kerosene into the market, Steinfeld envisages a quota-based system. “Airlines and airports would be required to have a minimum share of sustainable aviation fuels in the total volume of jet fuel that they put in their aircraft,” he says. This is possible as solar kerosene can be mixed with fossil-based kerosene. This would start out small, as little as 1 or 2 percent, which would raise the total fuel costs at first, though minimally—adding “only a few euros to the cost of a typical flight,” as Steinfeld puts it.

Meanwhile, rising quotas would lead to investment, and to falling costs, eventually replacing fossil-derived kerosene with solar kerosene. “By the time solar jet fuel reaches 10 to 15 percent of the total jet-fuel volume, we ought to see the costs for solar kerosene nearing those of fossil-derived kerosene,” he adds.

However, we may not have to wait too long for flights to operate solely on solar fuel. A commercial spin-off of Steinfeld’s laboratory, Synhelion, is working on commissioning the first industrial-scale solar fuel plant in 2023. The company has also collaborated with the airline SWISS to conduct a flight solely using its solar kerosene.


Match ID: 16 Score: 5.00 source: spectrum.ieee.org age: 52 days
qualifiers: 5.00 travel(|ing)

X-Rays Could Carry Quantum Signals Across the Stars
Mon, 18 Jul 2022 15:07:14 +0000


Quantum signals may possess a number of advantages over regular forms of communication, leading scientists to wonder if humanity was not alone in discovering such benefits. Now a new study suggests that, for hypothetical extraterrestrial civilizations, quantum transmissions using X-rays may be possible across interstellar distances.

Quantum communication relies on a quantum phenomenon known as entanglement. Essentially, two or more particles such as photons that get “linked” via entanglement can, in theory, influence each other instantly no matter how far apart they are.

Entanglement is essential to quantum teleportation, in which data can essentially disappear in one place and reappear in another. Since this information does not travel across the intervening space, there is no chance it will be lost.

To accomplish quantum teleportation, one would first entangle two photons. Then, one of the photons—the one to be teleported—is kept at one location while the other is beamed to whatever destination is desired.

Next, the quantum state of the photon at the destination—the state that defines its key characteristics—is analyzed, an act that also destroys it. Entanglement will lead the destination photon to prove identical to its partner. For all intents and purposes, the photon at the origin point “teleported” to the destination point—no physical matter moved, but the two photons are physically indistinguishable.

And to be clear, quantum teleportation cannot send information faster than the speed of light, because the destination photon must still be transmitted via conventional means.

One weakness of quantum communication is that entanglement is fragile. Still, researchers have successfully transmitted entangled photons that remained stable or “coherent” enough for quantum teleportation across distances as great as 1,400 kilometers.

Such findings led theoretical physicist Arjun Berera at the University of Edinburgh to wonder just how far quantum signals might stay coherent. First, he discovered quantum coherence might survive interstellar distances within our galaxy, and then he and his colleagues found quantum coherence might survive intergalactic distances.

“If photons in Earth’s atmosphere don’t decohere to 100 km, then in interstellar space where the medium is much less dense than our atmosphere, photons won’t decohere up to even the size of the galaxy,” Berera says.

In the new study, the researchers investigated whether and how well quantum communication might survive interstellar distances. Quantum signals might face disruption from a number of factors, such as the gravitational pull of interstellar bodies, they note.

The scientists discovered the best quantum communication channels for interstellar messages are X-rays. Such frequencies are easier to focus and detect across interstellar distances. (NASA has tested deep-space X-ray communication with its XCOM experiment.) The researchers also found that the optical and microwave bands could enable communication across large distances as well, albeit less effectively than X-rays.

Although coherence might survive interstellar distances, Berera does note quantum signals might lose fidelity. “This means the quantum state is sustained, but it can have a phase shift, so although the quantum information is preserved in these states, it has been altered by the effect of gravity.” Therefore, it may “take some work at the receiving end to account for these phase shifts and be able to assess the information contained in the original state.”

Why might an interstellar civilization transmit quantum signals as opposed to regular ones? The researchers note that quantum communication may allow greater data compression and, in some cases, exponentially faster speeds than classical channels. Such a boost in efficiency might prove very useful for civilizations separated by interstellar distances.

“It could be that quantum communication is the main communication mode in an extraterrestrial's world, so they just apply what is at hand to send signals into the cosmos,” Berera says.

The scientists detailed their findings online 28 June in the journal Physical Review D.


Match ID: 17 Score: 5.00 source: spectrum.ieee.org age: 68 days
qualifiers: 5.00 travel(|ing)

The Webb Space Telescope’s Profound Data Challenges
Fri, 08 Jul 2022 18:03:45 +0000


For a deep dive into the engineering behind the James Webb Space Telescope, see our collection of posts here.

When the James Webb Space Telescope (JWST) reveals its first images on 12 July, they will be the by-product of carefully crafted mirrors and scientific instruments. But all of its data-collecting prowess would be moot without the spacecraft’s communications subsystem.

The Webb’s comms aren’t flashy. Rather, the data and communication systems are designed to be incredibly, unquestionably dependable and reliable. And while some aspects of them are relatively new—it’s the first mission to use Ka-band frequencies for such high data rates so far from Earth, for example—above all else, JWST’s comms provide the foundation upon which JWST’s scientific endeavors sit.


As previous articles in this series have noted, JWST is parked at Lagrange point L2. It’s a point of gravitational equilibrium located about 1.5 million kilometers beyond Earth on a straight line between the planet and the sun. It’s an ideal location for JWST to observe the universe without obstruction and with minimal orbital adjustments.

Being so far away from Earth, however, means that data has farther to travel to make it back in one piece. It also means the communications subsystem needs to be reliable, because the prospect of a repair mission being sent to address a problem is, for the near term at least, highly unlikely. Given the cost and time involved, says Michael Menzel, the mission systems engineer for JWST, “I would not encourage a rendezvous and servicing mission unless something went wildly wrong.”

According to Menzel, who has worked on JWST in some capacity for over 20 years, the plan has always been to use well-understood Ka-band frequencies for the bulky transmissions of scientific data. Specifically, JWST is transmitting data back to Earth on a 25.9-gigahertz channel at up to 28 megabits per second. The Ka-band is a portion of the broader K-band (another portion, the Ku-band, was also considered).

The Lagrange points are equilibrium locations where competing gravitational tugs on an object net out to zero. JWST is one of three craft currently occupying L2 (shown here at an exaggerated distance from Earth). IEEE Spectrum

Both the data-collection and transmission rates of JWST dwarf those of the older Hubble Space Telescope. Compared to Hubble, which is still active and generates 1 to 2 gigabytes of data daily, JWST can produce up to 57 GB each day (although that amount is dependent on what observations are scheduled).

Menzel says he first saw the frequency selection proposals for JWST around 2000, when he was working at Northrop Grumman. He became the mission systems engineer in 2004. “I knew where the risks were in this mission. And I wanted to make sure that we didn’t get any new risks,” he says.


Besides, Ka-band frequencies can transmit more data than X-band (7 to 11.2 GHz) or S-band (2 to 4 GHz), common choices for craft in deep space. A high data rate is a necessity for the scientific work JWST will be undertaking. In addition, according to Carl Hansen, a flight systems engineer at the Space Telescope Science Institute (the science operations center for JWST), a comparable X-band antenna would be so large that the spacecraft would have trouble remaining steady for imaging.

Although the 25.9-GHz Ka-band frequency is the telescope’s workhorse communication channel, it also employs two channels in the S-band. One is the 2.09-GHz uplink that ferries future transmission and scientific observation schedules to the telescope at 16 kilobits per second. The other is the 2.27-GHz, 40-kb/s downlink over which the telescope transmits engineering data—including its operational status, systems health, and other information concerning the telescope’s day-to-day activities.

Any scientific data the JWST collects during its lifetime will need to be stored on board, because the spacecraft doesn’t maintain round-the-clock contact with Earth. Data gathered from its scientific instruments, once collected, is stored within the spacecraft’s 68-GB solid-state drive (3 percent is reserved for engineering and telemetry data). Alex Hunter, also a flight systems engineer at the Space Telescope Science Institute, says that by the end of JWST’s 10-year mission life, they expect to be down to about 60 GB because of deep-space radiation and wear and tear.

The onboard storage is enough to collect data for about 24 hours before it runs out of room. Well before that becomes an issue, JWST will have scheduled opportunities to beam that invaluable data to Earth.
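Those figures allow a quick back-of-the-envelope consistency check. The numbers below come straight from the article; the arithmetic is ours.

```python
# Data budget sketch using the figures quoted above: up to 57 GB of
# science data generated per day, a 68 GB SSD with 3 percent reserved,
# and a 28 Mb/s Ka-band downlink.
GB = 1e9  # bytes

daily_volume = 57 * GB          # worst-case science data per day
usable_ssd = 68 * GB * 0.97     # SSD capacity left for science data

hours_to_fill = usable_ssd / daily_volume * 24
print(f"SSD fills in roughly {hours_to_fill:.0f} hours")

downlink_rate_bytes = 28e6 / 8  # 28 Mb/s in bytes per second
hours_to_dump = daily_volume / downlink_rate_bytes / 3600
print(f"one day of data takes about {hours_to_dump:.1f} hours to downlink")
```

The roughly 28-hour fill time squares with the "about 24 hours" quoted above once operating margins are allowed for, and a full day's worth of data needs only about 4.5 hours of Ka-band contact time.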

JWST will stay connected via the Deep Space Network (DSN)—a resource it shares with the Parker Solar Probe, Transiting Exoplanet Survey Satellite, the Voyager probes, and the entire ensemble of Mars rovers and orbiters, to name just a few of the other heavyweights. The DSN consists of three antenna complexes: Canberra, Australia; Madrid, Spain; and Barstow, Calif. JWST needs to share finite antenna time with plenty of other deep-space missions, each with unique communications needs and schedules.


Sandy Kwan, a DSN systems engineer, says that contact windows with spacecraft are scheduled 12 to 20 weeks in advance. JWST had a greater number of scheduled contact windows during its commissioning phase, as instruments were brought on line, checked, and calibrated. Most of that process required real-time communication with Earth.

All of the communications channels use the Reed-Solomon error-correction protocol—the same error-correction standard used in DVDs and Blu-ray discs as well as QR codes. The lower data-rate S-band channels use binary phase-shift keying, which modulates the phase of a signal’s carrier wave. The Ka-band channel, however, uses quadrature phase-shift keying, which can double a channel’s data rate, at the cost of more complicated transmitters and receivers.
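The rate difference between the two keying schemes comes down to bits per symbol. A minimal illustration follows; the Gray-coded phase map here is a textbook example, not JWST's actual air interface.

```python
import numpy as np

def bpsk(bits):
    """One bit per symbol: two carrier phases, 0 and pi."""
    return np.exp(1j * np.pi * np.asarray(bits))

def qpsk(bits):
    """Two bits per symbol: four carrier phases, Gray-coded."""
    pairs = np.asarray(bits).reshape(-1, 2)
    gray = {(0, 0): 1, (0, 1): 3, (1, 1): 5, (1, 0): 7}
    phases = [gray[tuple(p)] * np.pi / 4 for p in pairs]
    return np.exp(1j * np.array(phases))

bits = [0, 1, 1, 0, 0, 0, 1, 1]
print(len(bpsk(bits)), "BPSK symbols vs", len(qpsk(bits)), "QPSK symbols")
```

Eight bits become eight BPSK symbols but only four QPSK symbols, so at a fixed symbol rate QPSK carries twice the bits per second.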

JWST’s communications with Earth incorporate an acknowledgement protocol—only after the JWST gets confirmation that a file has been successfully received will it go ahead and delete its copy of the data to clear up space.

The communications subsystem was assembled along with the rest of the spacecraft bus by Northrop Grumman, using off-the-shelf components sourced from multiple manufacturers.

JWST has had a long and often-delayed development, but its communications system has always been a bedrock for the rest of the project. Keeping at least one system dependable means it’s one less thing to worry about. Menzel can remember, for instance, ideas for laser-based optical systems that were invariably rejected. “I can count at least two times where I had been approached by people who wanted to experiment with optical communications,” says Menzel. “Each time they came to me, I sent them away with the old ‘Thank you, but I don’t need it. And I don’t want it.’”


Match ID: 18 Score: 5.00 source: spectrum.ieee.org age: 77 days
qualifiers: 5.00 travel(|ing)

Pentagon Aims to Demo a Nuclear Spacecraft Within 5 Years
Thu, 09 Jun 2022 16:44:41 +0000


In the latest push for nuclear power in space, the Pentagon’s Defense Innovation Unit (DIU) awarded a contract in May to Seattle-based Ultra Safe Nuclear to advance its nuclear power and propulsion concepts. The company is making a soccer ball–size radioisotope battery it calls EmberCore. The DIU’s goal is to launch the technology into space for demonstration in 2027.

Ultra Safe Nuclear’s system is intended to be lightweight, scalable, and usable as both a propulsion source and a power source. It will be specifically designed to give small-to-medium-size military spacecraft the ability to maneuver nimbly in the space between Earth orbit and the moon. The DIU effort is part of the U.S. military’s recently announced plans to develop a surveillance network in cislunar space.

Besides speedy space maneuvers, the DIU wants to power sensors and communication systems without having to worry about solar panels pointing in the right direction or batteries having enough charge to work at night, says Adam Schilffarth, director of strategy at Ultra Safe Nuclear. “Right now, if you are trying to take radar imagery in Ukraine through cloudy skies,” he says, “current platforms can only take a very short image because they draw so much power.”

Radioisotope power sources are well suited for small, uncrewed spacecraft, adds Christopher Morrison, who is leading EmberCore’s development. Such sources rely on the radioactive decay of an element that produces energy, as opposed to nuclear fission, which involves splitting atomic nuclei in a controlled chain reaction to release energy. Heat produced by radioactive decay is converted into electricity using thermoelectric devices.

Radioisotopes have provided heat and electricity for spacecraft since 1961. The Curiosity and Perseverance rovers on Mars, and deep-space missions including Cassini, New Horizons, and Voyager all use radioisotope batteries that rely on the decay of plutonium-238, which is nonfissile—unlike plutonium-239, which is used in weapons and power reactors.

For EmberCore, Ultra Safe Nuclear has instead turned to medical isotopes such as cobalt-60 that are easier and cheaper to produce. The materials start out inert, and have to be charged with neutrons to become radioactive. The company encapsulates the material in a proprietary ceramic for safety.

Cobalt-60 has a half-life of five years (compared to plutonium-238’s 90 years), which is enough for the cislunar missions that the DOD and NASA are looking at, Morrison says. He says that EmberCore should be able to provide 10 times as much power as a plutonium-238 system, providing over 1 million kilowatt-hours of energy using just a few pounds of fuel. “This is a technology that is in many ways commercially viable and potentially more scalable than plutonium-238,” he says.
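The trade-off between the two isotopes follows directly from exponential decay, P(t) = P0 · 2^(−t/t½). This is a sketch using standard half-life values (5.27 years for cobalt-60 and 87.7 years for plutonium-238, which the article rounds to 5 and 90); the power fractions are ours, not Ultra Safe Nuclear's figures.

```python
import math

def power_fraction(t_years, t_half):
    """Fraction of initial decay power remaining after t_years."""
    return 2 ** (-t_years / t_half)

for t in (1, 5, 10):
    print(f"after {t:2d} y: Co-60 {power_fraction(t, 5.27):.2f}, "
          f"Pu-238 {power_fraction(t, 87.7):.2f}")
```

Cobalt-60 starts out far more power-dense but fades appreciably within a decade, which is why it suits shorter cislunar missions while plutonium-238 powers decades-long probes like the Voyagers.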

One downside of the medical isotopes is that they can produce high-energy X-rays in addition to heat. So Ultra Safe Nuclear wraps the fuel with a radiation-absorbing metal shield. But in the future, the EmberCore system could be designed for scientists to use the X-rays for experiments. “They buy this heater and get an X-ray source for free,” says Schilffarth. “We’ve talked with scientists who right now have to haul pieces of lunar or Martian regolith up to their sensor because the X-ray source is so weak. Now we’re talking about a spotlight that could shine down to do science from a distance.”

Ultra Safe Nuclear’s contract is one of two awarded by the DIU—which aims to speed up the deployment of commercial technology through military use—to develop nuclear power and propulsion for spacecraft. The other contract was awarded to Avalanche Energy, which is making a lunchbox-size fusion device it calls an Orbitron. The device will use electrostatic fields to trap high-speed ions in slowly changing orbits around a negatively charged cathode. Collisions between the ions can result in fusion reactions that produce energetic particles.

Both companies will use nuclear energy to power high-efficiency electric propulsion systems. Electric propulsion technologies such as ion thrusters, which use electromagnetic fields to accelerate ions and generate thrust, are more efficient than chemical rockets, which burn fuel. Solar panels typically power the ion thrusters that satellites use today to change their position and orientation. Schilffarth says that the higher power from EmberCore should give a greater velocity change of 10 kilometers per second in orbit than today’s electric propulsion systems.

Ultra Safe Nuclear is also one of three companies developing nuclear fission thermal propulsion systems for NASA and the Department of Energy. Meanwhile, the Defense Advanced Research Projects Agency (DARPA) is seeking companies to develop a fission-based nuclear thermal rocket engine, with demonstrations expected in 2026.

This article appears in the August 2022 print issue as “Spacecraft to Run on Radioactive Decay.”


Match ID: 19 Score: 5.00 source: spectrum.ieee.org age: 107 days
qualifiers: 5.00 travel(|ing)

How Flyback Rocket Boosters Got Off the Ground
Mon, 21 Mar 2022 20:27:59 +0000


In the popular conception of a technological breakthrough, a flash of genius is followed quickly by commercial or industrial success, public acclaim, and substantial wealth for a small group of inventors and backers. In the real world, it almost never works out that way.

Advances that seem to appear suddenly are often backed by decades of development. Consider steam engines. Starting in the second quarter of the 19th century they began powering trains, and they soon revolutionized the transportation of people and goods. But steam engines themselves had been invented at the beginning of the 18th century. For 125 years they had been used to pump water out of mines and then to power the mills of the Industrial Revolution.


Lately we’ve become accustomed to seeing rocket boosters return to Earth and then land vertically, on their tails, ready to be serviced and flown again. (Much the same majestic imagery thrilled sci-fi moviegoers in the 1950s.) Today, both SpaceX and Blue Origin are using these techniques, and a third startup, Relativity Space, is on the verge of joining them. Such reusable rocketry is already cutting the cost of access to space and, with other advances yet to come, will help make it possible for humanity to return to the moon and eventually to travel to Mars.

Vertical landings, too, have a long history, with the same ground being plowed many times by multiple research organizations. From 1993 to 1996 a booster named DCX, for Delta Clipper Experimental, took off and landed vertically eight times at White Sands Missile Range. It flew to a height of only 2,500 meters, but it successfully negotiated the very tricky dynamics of landing a vertical cylinder on its end.

The key innovations that made all this possible happened 50 or more years ago. And those in turn built upon the invention a century ago of liquid-fueled rockets that can be throttled up or down by pumping more or less fuel into a combustion chamber.

In August 1954 the Rolls-Royce Thrust Measuring Rig, also known as the “flying bedstead,” took off and landed vertically while carrying a pilot. The ungainly contraption had two downward-pointing Rolls-Royce jet engines with nozzles that allowed the pilot to vector the thrust and control the flight. By 1957 another company, Hawker Siddeley, started work on turning this idea into a vertical take-off and landing (VTOL) fighter jet. It first flew in 1967 and entered service in 1969 as the Harrier Jump Jet, with new Rolls-Royce engines specifically designed for thrust vectoring. Thrust vectoring is a critical component of control for all of today’s reusable rocket boosters.

During the 1960s another rig, also nicknamed the flying bedstead, was developed in the United States for training astronauts to land on the moon. There was a gimbaled rocket engine that always pointed directly downward, providing thrust equal to five-sixths of the vehicle and the pilot’s weight, simulating lunar gravity. The pilot then controlled the thrust and direction of another rocket engine to land the vehicle safely.

It was not all smooth flying. Neil Armstrong first flew the trainer in March 1967, but he was nearly killed in May 1968 when things went awry and he had to use the ejection seat to rocket to safety. The parachute deployed and he hit the ground just 4 seconds later. Rocket-powered vertical descent was harder than it looked.

Vertical rocket landings have a long history, with the same ground being plowed many times by multiple research organizations.

Nevertheless, between 1969 and 1972, Armstrong and then five other astronauts piloted lunar modules to vertical landings on the moon. There were no ejection seats, and these have been the only crewed rocket-powered landings on a spaceflight. All other humans lofted into space have used Earth’s atmosphere to slow down, combining heat shields with either wings or parachutes.

In the early days of Blue Origin, the company returned to the flying-bedstead approach, and its vehicle took off and landed successfully in March 2005. It was powered by four jet engines, once again from Rolls-Royce, bought secondhand from the South African Air Force. Ten years later, in November 2015, Blue Origin’s New Shepard booster reached an altitude of 100 kilometers and then landed vertically. A month later SpaceX had its first successful vertical landing of a Falcon-9 booster.

Today’s reusable, or flyback, boosters also use something called grid fins, those honeycombed panels sticking out perpendicularly from the top of a booster that guide the massive cylinder as it falls through the atmosphere unpowered. The fins have an even longer history, as they have been part of every crewed Soyuz launch since the 1960s. They guide the capsule back to Earth if there’s an abort during the climb to orbit. They were last used in October 2018 when a Soyuz failed at 50 km up. The cosmonaut and astronaut who were aboard landed safely and had a successful launch in another Soyuz five months later.

The next big accomplishment will be crewed vertical landings, 50 years after mankind's last one, on the moon. It will almost certainly happen before this decade is out.

I’m less confident that we’ll see general-purpose quantum computers and abundant electricity from nuclear fusion in that time frame. But I’m pretty sure we’ll eventually get there with both. The arc of technology development is often long. And sometimes, the longer it is, the more revolutionary it is in the end.

This article appears in the April 2022 print issue as “The Long Road to Overnight Success .”


Match ID: 20 Score: 5.00 source: spectrum.ieee.org age: 186 days
qualifiers: 5.00 travel(|ing)

To Catch a Falling Satellite
Mon, 14 Mar 2022 16:55:14 +0000


It is the fate of many a dead satellite to spend its last years tumbling out of control. A fuel line may burst, or solar wind may surge, or there may be drag from the outer reaches of the atmosphere—and unless a spacecraft has been designed in some way that keeps it naturally stable, chances are good that it will begin to turn end over end.

That’s a problem, because Earth orbit is getting more and more crowded. Engineers would like to corral old pieces of space junk, but they can’t safely reach them, especially if they’re unstable. The European Space Agency says there are about 30,000 “debris objects” now being tracked in Earth orbit—derelict satellites, spent rocket stages, pieces sent flying from collisions in space. There may also be 900,000 smaller bits of orbital debris—everything from loose bolts to flecks of paint to shards of insulation. They may be less than 10 centimeters long, but they can still destroy a healthy satellite if they hit at orbital speeds.

“With more satellites being launched, we might encounter more situations where we have a defunct satellite that’s occupying a valuable orbit,” says Richard Linares, an assistant professor of aeronautics and astronautics at MIT. He’s part of an American-German project, called TumbleDock/ROAM, researching ways to corral and stabilize tumbling satellites so they can be deorbited or, in some cases, perhaps even refueled or repaired.

Engineers have put up with orbital debris for decades, but Linares says the picture is changing. For one thing, satellite technology is becoming more and more affordable—just look at SpaceX, which has been launching 40 satellites a week so far this year. For another, he says, the economic benefits those satellites offer—high-speed internet, GPS, climate and crop monitoring and other applications—will be threatened if the risk of impacts keeps growing.

“I think in the next few years we’ll have the technology to do something about space debris,” says Linares. “And there are economic drivers that will incentivize companies to do this.”

The TumbleDock/ROAM team has just finished a series of tests in the cabin of the International Space Station, using NASA robots called Astrobees to stand in for a tumbling satellite and a “chaser” spacecraft sent to catch it. The goal: to figure out algorithms so that a chaser can find its target, determine its tumble rates, and calculate the safest and most efficient approach to it.

Astrobee robot experiment aboard the ISS to reach a tumbling target in space. www.youtube.com

“There’s a massive amount of large debris out there,” says Keenan Albee, a Ph.D. student on the team at MIT. “Look at some of them, with large solar panels that are ready to whack you if you don’t do the approach correctly.”

The researchers decided early on that a chase vehicle needs enough autonomy to close in on a disabled satellite on its own. Even the largest satellites are too distant for ground-based tracking stations to track their attitude with any precision. A chaser, perhaps equipped with navigation cameras, lidar, and other sensors, will need to do the job in real time.

“The tumbling motion of a satellite can be quite complex,” says Roberto Lampariello, the principal investigator on the project at the German Aerospace Center, or DLR. “And if you want to be sure you are not going to collide with any appendages while approaching the mating point, having an autonomous method of guidance is, I think, very attractive.”

The Astrobee tests on the space station showed that it can be done, at least in principle. Each Astrobee robot is a cube, about 30 centimeters on a side, with navigation cameras, compressed-air thrusters, and Snapdragon processors much like what you would find in a smartphone. For the latest test, last month, NASA astronaut Mark Vande Hei set up two Astrobees a couple of meters apart. They then took their commands from Albee on the ground. He started the test runs, with one robot tumbling and the other trying to rendezvous with it. There have been glitches; the Astrobees needed help determining their precise location relative to the station walls. But the results of the tests were promising.

A next step, say the researchers, is to determine how best for a chase spacecraft to grapple its target, which is especially difficult if it’s a piece of debris with no docking mechanism. Other plans over the years have involved big nets or lasers; TumbleDock/ROAM team members say they’re intrigued by grippers that use van der Waals forces between atoms, the kinds that help a gecko cling to a sheer surface.

The larger question is how to turn experiments like these into actual solutions to a growing, if lofty, problem. Low Earth orbit has been crowded enough, for long enough, that satellite makers add shielding to their vehicles and space agencies continuously scan the skies to prevent close calls. No space travelers have been killed, and there have only been a few cases in which satellites were actually pulverized. But the problem has become increasingly expensive and, in some cases, dangerous. SpaceX has launched 2,000 Starlink Internet satellites so far, may launch 30,000 more, and has other companies (like Amazon) racing to keep up. They see profits up there.

MIT’s Linares says that, in fact, is why it’s worth figuring out the space-junk problem. “There’s a reason why those orbits are valuable,” he says. Companies may spend billions to launch new satellites—and don’t want them threatened by old satellites.

“If your company’s benefiting from an orbit band,” he says, “then you’d probably better get someone to clean it up for you.”


Match ID: 21 Score: 5.00 source: spectrum.ieee.org age: 194 days
qualifiers: 5.00 travel(|ing)

Satellite Imagery for Everyone
Sat, 19 Feb 2022 16:00:00 +0000


Every day, satellites circling overhead capture trillions of pixels of high-resolution imagery of the surface below. In the past, this kind of information was mostly reserved for specialists in government or the military. But these days, almost anyone can use it.

That’s because the cost of sending payloads, including imaging satellites, into orbit has dropped drastically. High-resolution satellite images, which used to cost tens of thousands of dollars, now can be had for the price of a cup of coffee.

What’s more, with the recent advances in artificial intelligence, companies can more easily extract the information they need from huge digital data sets, including ones composed of satellite images. Using such images to make business decisions on the fly might seem like science fiction, but it is already happening within some industries.



These underwater sand dunes adorn the seafloor between Andros Island and the Exuma islands in the Bahamas. The turquoise to the right reflects a shallow carbonate bank, while the dark blue to the left marks the edge of a local deep called Tongue of the Ocean. This image was captured in April 2020 using the Moderate Resolution Imaging Spectroradiometer on NASA’s Terra satellite.

Joshua Stevens/NASA Earth Observatory


Here’s a brief overview of how you, too, can access this kind of information and use it to your advantage. But before you’ll be able to do that effectively, you need to learn a little about how modern satellite imagery works.

The orbits of Earth-observation satellites generally fall into one of two categories: GEO and LEO. The former is shorthand for geosynchronous equatorial orbit. GEO satellites are positioned roughly 36,000 kilometers above the equator, where they circle in sync with Earth’s rotation. Viewed from the ground, these satellites appear to be stationary, in the sense that their bearing and elevation remain constant. That’s why GEO is said to be a geostationary orbit.
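That altitude is no accident: Kepler's third law, T = 2π√(a³/μ), pins down the one orbital radius whose period matches Earth's rotation. A quick check, using standard constants that are not from the article:

```python
import math

mu = 3.986e5          # Earth's gravitational parameter, km^3/s^2
r_earth = 6378.0      # Earth's equatorial radius, km
a = r_earth + 35786   # geostationary orbital radius, km

period_hours = 2 * math.pi * math.sqrt(a**3 / mu) / 3600
print(f"orbital period: {period_hours:.2f} h")  # ~23.9 h, one sidereal day
```

The result is one sidereal day (about 23.93 hours), the time Earth takes to rotate once relative to the stars, so the satellite keeps pace with the ground below it.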

Such orbits are, of course, great for communications relays—it’s what allows people to mount satellite-TV dishes on their houses in a fixed orientation. But GEO satellites are also appropriate when you want to monitor some region of Earth by capturing images over time. Because the satellites are so high up, the resolution of that imagery is quite coarse, however. So these orbits are primarily used for observation satellites designed to track changing weather conditions over broad areas.

Being stationary with respect to Earth means that GEO satellites are always within range of a downlink station, so they can send data back to Earth in minutes. This allows them to alert people to changes in weather patterns almost in real time. Most of this kind of data is made available for free by the U.S. National Oceanic and Atmospheric Administration.



In March 2021, the container ship Ever Given ran aground, blocking the Suez Canal for six days. This satellite image of the scene, obtained using synthetic-aperture radar, shows the kind of resolution that is possible with this technology.

Capella Space


The other option is LEO, which stands for low Earth orbit. Satellites placed in LEO are much closer to the ground, which allows them to obtain higher-resolution images. And the lower you can go, the better the resolution you can get. The company Planet, for example, increased the resolution of its recently completed satellite constellation, SkySat, from 72 centimeters per pixel to just 50 cm—an incredible feat—by lowering the orbits its satellites follow from 500 to 450 km and improving the image processing.
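For a fixed camera, ground sample distance scales roughly linearly with altitude, so one can estimate how much of SkySat's improvement the lower orbit alone buys. This is a rough linear-scaling approximation, not Planet's published analysis:

```python
# Linear-scaling estimate: GSD is proportional to altitude for a
# fixed optic and detector.
gsd_old_cm, alt_old_km, alt_new_km = 72.0, 500.0, 450.0
gsd_from_orbit = gsd_old_cm * alt_new_km / alt_old_km
print(f"orbit lowering alone: ~{gsd_from_orbit:.1f} cm per pixel")
```

Lowering the orbit alone only gets SkySat from 72 cm to about 65 cm; the rest of the way down to 50 cm came from the improved image processing the article mentions.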

The best commercially available spatial resolution for optical imagery is 25 cm, which means that one pixel represents a 25-by-25-cm area on the ground—roughly the size of your laptop. A handful of companies capture data with 25-cm to 1-meter resolution, which is considered high to very high resolution in this industry. Some of these companies also offer data from 1- to 5-meter resolution, considered medium to high resolution. Finally, several government programs have made optical data available at 10-, 15-, 30-, and 250-meter resolutions for free with open data programs. These include NASA/U.S. Geological Survey Landsat, NASA MODIS (Moderate Resolution Imaging Spectroradiometer), and ESA Copernicus. This imagery is considered low resolution.

Because the satellites that provide the highest-resolution images are in the lowest orbits, they sense less area at once. To cover the entire planet, a satellite can be placed in a polar orbit, which takes it from pole to pole. As it travels, Earth rotates under it, so on its next pass, it will be above a different part of Earth.

Many of these satellites don’t pass directly over the poles, though. Instead, they are placed in a near-polar orbit that has been specially designed to take advantage of a subtle bit of physics. You see, the spinning Earth bulges outward slightly at the equator. That extra mass causes the orbits of satellites that are not in polar orbits to shift or (technically speaking) to precess. Satellite operators often take advantage of this phenomenon to put a satellite in what’s called a sun-synchronous orbit. Such orbits allow the repeated passes of the satellite over a given spot to take place at the same time of day. Not having the pattern of shadows shift between passes helps the people using these images to detect changes.




It usually takes 24 hours for a satellite in polar orbit to survey the entire surface of Earth. To image the whole world more frequently, satellite companies use multiple satellites, all equipped with the same sensor and following different orbits. In this way, these companies can provide more frequently updated images of a given location. For example, Maxar’s Worldview Legion constellation, launching later this year, includes six satellites.
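The benefit of adding satellites can be approximated crudely: evenly phasing N identical satellites divides the single-satellite revisit interval by roughly N. A toy estimate (real revisit rates also depend on swath width, latitude, and pointing agility):

```python
def revisit_hours(single_sat_hours, n_satellites):
    """Idealized revisit interval for n evenly phased satellites
    carrying identical sensors."""
    return single_sat_hours / n_satellites

# One satellite covers Earth in roughly 24 hours; six (as in Worldview Legion)
# would cut the idealized revisit interval to about 4 hours.
print(revisit_hours(24, 6))  # 4.0
```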

After a satellite captures some number of images, all that data needs to be sent down to Earth and processed. The time required for that varies.

DigitalGlobe (which Maxar acquired in 2017) recently announced that it had managed to send data from a satellite down to a ground station and store it in the cloud in less than a minute. That was possible because the image sent back was of the parking lot of the ground station itself, so the satellite didn’t have to travel between the point of collection and a point within range of the station before doing its data “dumping,” as this process is called.

In general, Earth-observation satellites in LEO don’t capture imagery all the time—they do that only when they are above an area of special interest. That’s because these satellites are limited in how much data they can send at one time. Typically, they can transmit data for only 10 minutes or so before they pass out of range of a ground station. And they cannot record more data than they’ll have time to dump.
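That trade-off can be sketched as a back-of-the-envelope budget. The 10-minute contact window comes from the text; the downlink rate and scene size below are illustrative assumptions, not figures for any particular satellite:

```python
def downlink_budget_gb(pass_seconds, rate_mbps):
    """Data (in gigabytes) that fits in one ground-station contact window."""
    return pass_seconds * rate_mbps / 8 / 1000  # megabits -> gigabytes

# Assumed: a 10-minute pass at an 800 Mbit/s X-band downlink, 2 GB scenes
budget = downlink_budget_gb(10 * 60, 800)
print(f"{budget:.0f} GB per pass, ~{int(budget // 2.0)} two-GB scenes")
```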

Currently, ground stations are located mostly near the poles, the regions that satellites in polar orbits pass over most often. But we can soon expect distances to the nearest ground station to shorten, because both Amazon and Microsoft have announced intentions to build large networks of ground stations located all over the world. As it turns out, hosting the terabytes of satellite data collected daily is big business for these companies, which sell their cloud services (Amazon Web Services and Microsoft’s Azure) to satellite operators.

For now, if you are looking for imagery of an area far from a ground station, expect a significant delay—maybe hours—between capture and transmission of the data. The data will then have to be processed, which adds yet more time. The fastest providers currently make their data available within 48 hours of capture, but not all can manage that. While it is possible, under ideal weather conditions, for a commercial entity to request a new capture and get the data it needs delivered the same week, such quick turnaround times are still considered cutting edge.




I’ve been using the word “imagery,” but it’s important to note that satellites do not capture images the same way ordinary cameras do. The optical sensors in satellites are calibrated to measure reflectance over specific bands of the electromagnetic spectrum. This could mean they record how much red, green, and blue light is reflected from different parts of the ground. The satellite operator will then apply a variety of adjustments to correct colors, combine adjacent images, and account for parallax, forming what’s called a true-color composite image, which looks pretty much like what you would expect to get from a good camera floating high in the sky and pointed directly down.
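A minimal sketch of that band-stacking step, using NumPy and a simple percentile contrast stretch. Real pipelines also orthorectify, mosaic adjacent scenes, and correct for the atmosphere, so this shows only the final compositing stage:

```python
import numpy as np

def true_color_composite(red, green, blue, stretch_pct=2):
    """Stack per-band reflectance arrays into a display-ready RGB image,
    applying a crude percentile-based contrast stretch."""
    rgb = np.dstack([red, green, blue]).astype(float)
    lo, hi = np.percentile(rgb, [stretch_pct, 100 - stretch_pct])
    return np.clip((rgb - lo) / (hi - lo), 0, 1)  # scaled to [0, 1] for display

# Toy 2x2 scene with arbitrary reflectance values
r = np.array([[0.3, 0.5], [0.2, 0.4]])
g = np.array([[0.2, 0.4], [0.3, 0.5]])
b = np.array([[0.1, 0.2], [0.2, 0.3]])
img = true_color_composite(r, g, b)
print(img.shape)  # (2, 2, 3)
```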

Imaging satellites can also capture data outside of the visible-light spectrum. The near-infrared band is widely used in agriculture, for example, because these images help farmers gauge the health of their crops. This band can also be used to detect soil moisture and a variety of other ground features that would otherwise be hard to determine.
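The standard way to turn red and near-infrared reflectance into a crop-health signal is the normalized difference vegetation index (NDVI), which approaches +1 over dense, healthy vegetation and hovers near 0 over bare soil. A minimal sketch:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index from near-infrared and red
    reflectance. Healthy plants reflect strongly in NIR and absorb red."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red + eps)  # eps guards against division by zero

print(round(float(ndvi(0.45, 0.05)), 2))  # dense canopy -> 0.8
```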

Longer-wavelength “thermal” IR does a good job of penetrating smoke and picking up heat sources, making it useful for wildfire monitoring. And synthetic-aperture radar satellites, which I discuss in greater detail below, are becoming more common because the images they produce aren’t affected by clouds and don’t require the sun for illumination.

You might wonder whether aerial imagery, say, from a drone, wouldn’t work at least as well as satellite data. Sometimes it can. But for many situations, using satellites is the better strategy. Satellites can capture imagery over areas that would be difficult to access otherwise because of their remoteness, for example. Or there could be other sorts of accessibility issues: The area of interest could be in a conflict zone, on private land, or in another place that planes or drones cannot overfly.

So with satellites, organizations can easily monitor the changes taking place at various far-flung locations. Satellite imagery allows pipeline operators, for instance, to quickly identify incursions into their right-of-way zones. The company can then take steps to prevent a disastrous incident, such as someone puncturing a gas pipeline while construction is taking place nearby.


This satellite image shows a snow-covered area. A tongue of darker material is draped over the side of a slope, impinging on a nearby developed area with buildings.

This SkySat image shows the effect of a devastating landslide that took place on 30 December 2020. Debris from that landslide destroyed buildings and killed 10 people in the Norwegian village of Ask.

SkySat/Planet



The ability to compare archived imagery with recently acquired data has helped a variety of industries. For example, insurance companies sometimes use satellite data to detect fraudulent claims (“Looks like your house had a damaged roof when you bought it…”). And financial-investment firms use satellite imagery to evaluate such things as retailers’ future profits based on parking-lot fullness or to predict crop prices before farmers report their yields for the season.

Satellite imagery provides a particularly useful way to find or monitor the location of undisclosed features or activities. Sarah Parcak of the University of Alabama, for example, uses satellite imagery to locate archaeological sites of interest. 52Impact, a consulting company in the Netherlands, identified undisclosed waste dump sites by training an algorithm to recognize their telltale spectral signature. Satellite imagery has also helped identify illegal fishing activities, fight human trafficking, monitor oil spills, get accurate reporting on COVID-19 deaths, and even investigate Uyghur internment camps in China—all situations where the primary actors couldn’t be trusted to accurately report what’s going on.

Despite these many successes, investigative reporters and nongovernmental organizations aren’t yet using satellite data regularly, perhaps because even the small cost of the imagery is a deterrent. Thankfully, some kinds of low-resolution satellite data can be had for free.

The first places to look for free satellite imagery are the Copernicus Open Access Hub and EarthExplorer. Both offer free access to a wide range of open data. The imagery is lower resolution than what you can purchase, but if the limited resolution meets your needs, why spend money?

If you require medium- or high-resolution data, you might be able to buy it directly from the relevant satellite operator. This field recently went through a period of mergers and acquisitions, leaving only a handful of providers, the big three in the West being Maxar and Planet in the United States and Airbus in Germany. There are also a few large Asian providers, such as SI Imaging Services in South Korea and Twenty First Century Aerospace Technology in Singapore. Most providers have a commercial branch, but they primarily target government buyers. And they often require large minimum purchases, which puts them out of reach of companies that need to monitor only hundreds of locations or fewer.


Fortunately, approaching a satellite operator isn’t the only option. In the past five years, a cottage industry of consultants and local resellers with exclusive deals to service a certain market has sprung up. Aggregators and resellers spend years negotiating contracts with multiple providers so they can offer customers access to data sets at more attractive prices, sometimes for as little as a few dollars per image. Some companies providing geographic information systems—including Esri, L3Harris, and Safe Software—have also negotiated reselling agreements with satellite-image providers.

Traditional resellers are middlemen who will connect you with a salesperson to discuss your needs, obtain quotes from providers on your behalf, and negotiate pricing and priority schedules for image capture and sometimes also for the processing of the data. This is the case for Apollo Mapping, European Space Imaging, Geocento, LandInfo, Satellite Imaging Corp., and many more. The more innovative resellers will give you access to digital platforms where you can check whether an image you need is available from a certain archive and then order it. Examples include LandViewer from EOS and Image Hunter from Apollo Mapping.

More recently, a new crop of aggregators began offering customers the ability to programmatically access Earth-observation data sets. These companies work best for people looking to integrate such data into their own applications or workflows. These include the company I work for, SkyWatch, which provides such a service, called EarthCache. Other examples are UP42 from Airbus and Sentinel Hub from Sinergise.

While you will still need to talk with a sales rep to activate your account—most often to verify that you will use the data in ways that fit the company’s terms of service and licensing agreements—once you’ve been granted access to their applications, you will be able to programmatically order archive data from one or multiple providers. SkyWatch is, however, the only aggregator allowing users to programmatically request future data to be collected (“tasking a satellite”).

While satellite imagery is fantastically abundant and easy to access today, two changes are afoot that will expand further what you can do with satellite data: faster revisits and greater use of synthetic-aperture radar (SAR).

This image shows a sprawling compound of dozens of large buildings located in a desert area.

This image shows a racetrack-shaped structure with a tall chimney in the middle, built in an area where the ground is a distinctly reddish hue.

Satellite images have helped to reveal China’s treatment of its Muslim Uyghur minority. About a million Uyghurs (and other ethnic minorities) have been interned in prisons or camps like the one shown here [top], which lies to the east of the city of Ürümqi, the capital of China’s Xinjiang Uyghur Autonomous Region. Another satellite image [bottom] shows the characteristic oval shape of a fixed-chimney Bull’s trench kiln, a type widely used for manufacturing bricks in southern Asia. This one is located in Pakistan’s Punjab province. This design poses environmental concerns because of the sooty air pollution it generates, and such kilns have also been associated with human-rights abuses.

Top: CNES/Airbus/Google Earth; Bottom: Maxar Technologies/Google Earth

The first of these developments is not surprising. As more Earth-observation satellites are put into orbit, more images will be taken, more often, so the frequency with which any given area gets imaged will increase. Right now, that’s typically two or three times a week. Expect the revisit rate soon to become several times a day. This won’t entirely address the challenge of clouds obscuring what you want to view, but it will help.

The second development is more subtle. Data from the two satellites of the European Space Agency’s Sentinel-1 SAR mission, available at no cost, has enabled companies to dabble in SAR over the last few years.

With SAR, the satellite beams radio waves down and measures the return signals bouncing off the surface. It does that continually, and clever processing is used to turn that data into images. The use of radio allows these satellites to see through clouds and to collect measurements day and night. Depending on the radar band that’s employed, SAR imagery can be used to judge material properties, moisture content, precise movements, and elevation.
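One property worth knowing as SAR data becomes more common: for a focused stripmap system, the textbook along-track (azimuth) resolution is about half the physical antenna length, independent of range, which is part of why even small radar satellites can produce sharp imagery. A sketch of that relation (spotlight modes and processing choices change the picture in practice):

```python
def stripmap_azimuth_resolution_m(antenna_length_m):
    """Textbook azimuth resolution of a focused stripmap SAR:
    roughly half the along-track antenna length, independent of range."""
    return antenna_length_m / 2

print(stripmap_azimuth_resolution_m(3.0))  # a 3 m antenna -> 1.5 m resolution
```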

As more companies get familiar with such data sets, there will no doubt be a growing demand for satellite SAR imagery, which has been widely used by the military since the 1970s. But it’s just now starting to appear in commercial products. You can expect those offerings to grow dramatically, though.

Indeed, a large portion of the money being invested in this industry is currently going to fund large SAR constellations, including those of Capella Space, Iceye, Synspective, XpressSAR, and others. The market is going to get crowded fast, which is great news for customers. It means they will be able to obtain high-resolution SAR images of the place they’re interested in, taken every hour (or less), day or night, cloudy or clear.

People will no doubt figure out wonderful new ways to employ this information, so the more folks who have access to it, the better. This is something my colleagues at SkyWatch and I deeply believe, and it’s why we’ve made it our mission to help democratize access to satellite imagery.

One day in the not-so-distant future, Earth-observation satellite data might become as ubiquitous as GPS, another satellite technology first used only by the military. Imagine, for example, being able to take out your phone and say something like, “Show me this morning’s soil-moisture map for Grover’s Corners High; I want to see whether the baseball fields are still soggy.”

This article appears in the March 2022 print issue as “A Boom with a View.”

Editor's note: The original version of this article incorrectly stated that Maxar's Worldview Legion constellation launched last year.



Following the Money in the Air-Taxi Craze
Tue, 08 Feb 2022 15:04:00 +0000


When entrepreneur JoeBen Bevirt launched Joby Aviation 12 years ago, it was just one of a slew of offbeat tech projects at his Sproutwerx ranch in the Santa Cruz mountains. Today, Joby has more than 1,000 employees and it’s backed by close to US $2 billion in investments, including $400 million from Toyota Motor Corporation along with big infusions from Uber and JetBlue.

Having raked in perhaps 30 percent of all the money invested in electrically powered vertical takeoff and landing (eVTOL) aircraft so far, Joby is the colossus in an emerging class of startups working on these radical, battery-powered commercial flyers. All told, at least 250 companies worldwide are angling to revolutionize transportation in and around cities with a new category of aviation, called urban air mobility or advanced air mobility. With Joby at the apex, the category’s top seven companies together have hauled in more than $5 billion in funding—a figure that doesn’t include private firms, whose finances haven’t been disclosed.

But with some of these companies pledging to start commercial operations in 2024, there is no clear answer to a fundamental question: Are we on the verge of a stunning revolution in urban transportation, or are we witnessing, as aviation analyst Richard Aboulafia puts it, the “mother of all aerospace bubbles”?

Even by the standards of big-money tech investment, the vision is giddily audacious. During rush hour, the skies over a large city, such as Dubai or Madrid or Los Angeles, would swarm with hundreds, and eventually thousands, of eVTOL “air taxis.” Each would seat between one and perhaps half a dozen passengers, and would, eventually, be autonomous. Hailing a ride would be no more complicated than scheduling a trip on a ride-sharing app.

“We’re going to have to get the consumer used to thinking about flying in a small aircraft without a pilot on board. I have reservations about the general public’s willingness to accept that vision.”
—Laurie Garrow, Georgia Tech

And somehow, the cost would be no greater, either. In a discussion hosted by the Washington Post last July, Bevirt declared, “Our initial price point would be comparable to the cost of a taxi or an Uber, but our target is to move quickly down to the cost of what it costs you to drive your own car. And we believe that's the critical unlock to making this transformative to the world and for people’s daily lives.” Asked to put some dollar figures on his projection, Bevirt said, “Our goal is to launch this service [in 2024] at an average price of around $3 a mile and to move that down below $1 a mile over time.” The cost of an Uber varies by city and time of day, but it’s usually between $1 and $2 per mile, not including fees.

Industry analysts tend to have more restrained expectations. With the notable exception of China, they suggest, limited commercial flights will begin with eVTOL aircraft flown by human pilots, a phase that is expected to last six to eight years at least. Costs will be similar to those of helicopter trips, which tend to be in the range of $6 to $10 per mile or more. Of the 250+ startups in the field, only three—Kittyhawk, Wisk Aero (a joint venture of Kittyhawk and Boeing), and EHang—plan to go straight to full autonomy without a preliminary phase involving pilots, says Chris Anderson, chief operating officer at Kittyhawk.

To some, the autonomy issue is the heart of whether this entire enterprise can succeed economically. “When you figure in autonomy, you go from $3 a mile to 50 cents a mile,” says Anderson, citing studies done by his company. “You can’t do that with a pilot in the seat.”
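The per-mile figures quoted in this article lend themselves to a quick trip-cost comparison. The rates below are taken or interpolated from the numbers cited here (Uber at $1 to $2, Joby's $3 launch target, Kittyhawk's 50-cent autonomous estimate, helicopters at $6 to $10), so treat them as illustrative:

```python
RATES_PER_MILE = {  # USD per mile, drawn from figures quoted in the article
    "Uber (typical midpoint)": 1.5,
    "eVTOL, piloted (Joby launch target)": 3.0,
    "eVTOL, autonomous (Kittyhawk estimate)": 0.5,
    "Helicopter charter (midpoint)": 8.0,
}

TRIP_MILES = 20  # e.g., downtown to a regional airport
for mode, rate in RATES_PER_MILE.items():
    print(f"{mode}: ${rate * TRIP_MILES:.0f}")
```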

Laurie A. Garrow, a professor at the Georgia Institute of Technology, agrees. “For the large-scale vision, autonomy will be critical,” she says. “In order to get to the vision that people have, where this is a ubiquitous mode of transportation with a high market share, the only way to get that is by… eliminating the pilot.” Garrow, a civil engineer who co-directs the university’s Center for Urban and Regional Air Mobility, adds that autonomy presents challenges beyond technology: “We’re going to have to get the consumer used to thinking about flying in a small aircraft without a pilot on board. I have reservations about the general public’s willingness to accept that vision, especially early on.”

“The technical problems are, if not solved, then solvable. The main limiters are laws and regulations.”
—Chris Anderson, COO, Kittyhawk

Some analysts have much more fundamental doubts. Aboulafia, managing director at the consultancy AeroDynamic Advisory, says the figures simply don’t add up. eVTOL startups are counting on mass-manufacturing techniques to reduce the costs of these exotic aircraft, but such techniques have never been applied to producing aircraft on the scale specified in the projections. Even the anticipated lower operating costs, Aboulafia adds, won’t compensate. “If I started a car service here in Washington, D.C., using Rolls Royces, you’d think I was out of my mind, right?” he asks. “But if I put batteries in those Rolls Royces, would you think I was any less crazy?”

What everyone agrees on is that achieving even a modest amount of success for eVTOLs will require surmounting entire categories of challenges, including regulations and certification, technology development, and the operational considerations of safely flying large numbers of aircraft in a small airspace.

To some, certification will be the highest hurdle. “The technical problems are, if not solved, then solvable,” says Anderson. “The main limiters are laws and regulations.”

There are dozens of aviation certification agencies in the world. But the three most important ones for these new aircraft are the Federal Aviation Administration (FAA) in the U.S., the European Union Aviation Safety Agency (EASA), and the Civil Aviation Administration of China (CAAC). Of the three, the FAA is considered the most challenging, for several reasons. One is that, to deal with eVTOLs, the agency has chosen to adapt its existing certification rules. That gives some observers pause, because the FAA does not have a body of knowledge and experience for certifying aircraft that fly by means of battery systems and electric motors. The EASA, on the other hand, has created an entirely new set of regulations tailored for eVTOL aircraft and related technology, according to Erin Rivera, senior associate for regulatory affairs at Lilium.

To clear an aircraft for commercial flight, the FAA actually requires three certifications: one for the aircraft itself, one for its operations, and one for its manufacturing. For the aircraft, the agency designates different categories, or “parts,” for different kinds of fliers. For eVTOLs (other than multicopters), the applicable category seems to be Title 14 Code of Federal Regulations, Part 23, which covers “normal, utility, acrobatic, and commuter category airplanes.” The certification process itself is performance based, meaning that the FAA establishes performance criteria that an aircraft must meet, but does not specify how it must meet them.

Because eVTOLs are so novel, the FAA is expected to lean on industry-developed standards referred to as Means of Compliance (MOC). The proposed MOCs must be acceptable to the FAA. Through a certification scheme known as the “issue paper process,” the applicant begins by submitting what’s known as a G1 proposal, which specifies the applicable certification standards and special conditions that must be met to achieve certification. The FAA reviews and then either approves or rejects the proposal. If it’s rejected, the applicant revises the proposal to address the FAA’s concerns and tries again.

“If very high levels of automation are critical to scaling, that will be very difficult to certify. How do you certify all the algorithms?”
—Matt Metcalfe, Deloitte Consulting

Some participants are wary. When he was the chief executive of drone maker 3D Robotics, Anderson participated in an analogous experiment in which the FAA had pledged to work more closely with industry to expedite certification of drone aircraft such as multicopters. “That was five years ago, and none of the drones have been certified,” Anderson points out. “It was supposed to be agile and streamlined, and it has been anything but.”

Nobody knows how many eVTOL startups have started the certification process with the FAA, although a good guess seems to be one or two dozen. Joby is furthest along in the process, according to Mark Moore, CEO of Whisper Aero, a maker of advanced electric propulsor systems in Crossville, Tenn. The G1 certification proposals are not public, but when the FAA accepts one (presumably Joby’s), it will become available through the U.S. Federal Register for public comment. Observers expect that to happen any day now.

This certification phase of piloted aircraft is fraught with unknowns because of the novelty of the eVTOL craft themselves. But experts say a greater challenge lies ahead, when manufacturers seek to certify the vehicles for autonomous flight. “If very high levels of automation are critical to scaling, that will be very difficult to certify,” says Matt Metcalfe, a managing director in Deloitte Consulting's Future of Mobility and Aviation practice. “That’s a real challenge, because it’s so complicated. How do you certify all the algorithms?”

“It’s a matter of, how do you ensure that autonomous technology is going to be as safe as a pilot?” says an executive at one of the startups. “How do you certify that it’s always going to be able to do what it says? With true autonomous technology, the system itself can make an undetermined number of decisions, within its programming. And the way the current certification regulations work, is that they want to be able to know the inputs and outcome of every decision that the aircraft system makes. With a fully autonomous system, you can’t do that.”

Perhaps surprisingly, most experts contacted for this story agreed with Kittyhawk's Anderson that the technical challenges of building the aircraft themselves are solvable. Even autonomy—certification challenges aside—is within reach, most say. The Chinese company EHang has already offered fully autonomous trial flights of its EH216 multicopter to tourists in the northeastern port city of Yantai and is now building a flight hub in its home city of Guangzhou. Wisk, Kittyhawk, Joby, and other companies have collectively conducted thousands of flights that were at least partially autonomous, without a pilot on board.

Experts foresee eVTOLs largely replacing helicopters for niche applications. There’s less agreement on whether middle-class people will ever be routinely whisked around cities for pennies a mile.

A more imposing challenge, and one likely to determine whether the grand vision of urban air mobility comes to pass, is whether municipal and aviation authorities can solve the challenges of integrating large numbers of eVTOLs into the airspace over major cities. Some of these challenges are, like the aircraft themselves, totally new. For example, most viable scenarios require the construction of “vertiports” in and around cities. These would be like mini airports where the eVTOLs would take off and land, be recharged, and take on and discharge passengers. Right now, it’s not clear who would pay for these. “Manufacturers probably won’t have the money to do it,” says Metcalfe at Deloitte.

As Georgia Tech's Garrow sees it, “vertiports may be one of the greatest constraints on scalability of UAM.” Vertiports, she explains, will be the “pinch points,” because at urban facilities, space will likely be limited to accommodating several aircraft at most. And yet at such a facility, room will be needed during rush hours to accommodate dozens of aircraft needing to land, be charged, take on passengers, and take off. “So the scalability of operations at the vertiports, and the amount of land space required to do that, are going to be two major challenges.”

Despite all the challenges, Garrow, Metcalfe, and others are cautiously optimistic that air mobility will eventually become part of the urban fabric in many cities. They foresee an initial period in which the eVTOLs largely replace helicopters in a few niche applications, such as linking downtown transportation depots to airports for those who can afford it, taking tourists on sightseeing tours, and transporting organs and high-risk patients among hospitals. There’s less agreement on whether middle-class people will ever be routinely whisked around cities for pennies a mile. Even some advocates think that’s more than 10 years away, if it happens at all.

If it does happen, a few studies have predicted that travel times and greenhouse-gas and pollutant emissions could all be reduced. A 2020 study published by the U.S. National Academy of Sciences found a substantial reduction in overall energy use for transportation under “optimistic” scenarios for urban air mobility. And a 2021 study at the University of California, Berkeley, found that in the San Francisco Bay area, overall travel times could be reduced with as few as 10 vertiports. The benefits went up as the number of vertiports increased and as the transfer times at the vertiports went down. But the study also warned that “vertiport scheduling and capacity may become bottlenecks that limit the value of UAM.”

Metcalfe notes that ubiquitous modern conveniences like online shopping have already unleashed tech-based revolutions on a par with the grand vision for UAM. “We tend to look at this through the lens of today,” he says. “And that may be the wrong way to look at it. Ten years ago we never would have thought we’d be getting two or three packages a day. Similarly, the way we move people and goods in the future could be very, very different from the way we do it today.”

This article appears in the March 2022 print issue as “What’s Behind the Air-Taxi Craze.”



