RSS Rabbit
News that matters, fast.
Good luck, have news.
Happy scrolling!


Date/Time of Last Update: Sat Sep 24 18:00:45 2022 UTC

********** CLIMATE **********

UK environment laws under threat in ‘deregulatory free-for-all’
Fri, 23 Sep 2022 08:22:34 GMT

Campaigners say revoking of post-Brexit protections amounts to legislative vandalism

Hundreds of Britain’s environmental laws covering water quality, sewage pollution, clean air, habitat protections and the use of pesticides are lined up for removal from UK law under a government bill.

Environmentalists accused Liz Truss’s government of reneging on a commitment made after Brexit to halt the decline of nature by 2030. They say the revoking of 570 environmental laws that were rolled over from EU law after Brexit amounts to a deregulatory free-for-all leaving the environment unprotected.

Continue reading...
Match ID: 0 Score: 40.00 source: www.theguardian.com age: 1 day
qualifiers: 40.00 air pollution

Climate Change is NSF Engineering Alliance’s Top Research Priority
Tue, 20 Sep 2022 20:00:00 +0000

Since its launch in April 2021, the Engineering Research Visioning Alliance has convened a diverse set of experts to explore three areas in which fundamental research could have the most impact: climate change; the nexus of biology and engineering; and securing critical infrastructure against hackers.

To identify priorities for each theme, ERVA—an initiative funded by the U.S. National Science Foundation—holds what are termed visioning events, wherein IEEE members and hundreds of other experts from academia, industry, and nonprofits can conceptualize bold ideas. The results are distilled into reports that identify actionable priorities for engineering research pursuit. Reports from recent visioning events are slated to be released to the public in the next few months.

IEEE is one of more than 20 professional engineering societies that have joined ERVA as affiliate partners.

Research energy storage and greenhouse gas capture solutions

Identifying technologies to address the climate crisis was ERVA’s first theme. The theme was based on the results of a survey of the engineering community that ERVA conducted last year, asking what the research priorities should be.

“The resounding answer from the 500 respondents was climate change,” says Dorota Grejner-Brzezinska, ERVA’s principal investigator. She is a vice president for knowledge enterprise at Ohio State University, in Columbus.

During the virtual visioning event in December, experts explored solar and renewable energy, carbon sequestration, water management, and geoengineering. The climate change task force released its report last month.

These are some of the research areas ERVA said should be pursued:

  • Energy storage, transmission, and critical materials. The materials include those that are nanoengineered, ones that could be used for nontraditional energy storage, and those that can extract additional energy from heat cycles.
  • Greenhouse gas capture and elimination. Research priorities included capturing and eliminating methane and nitrous oxide released in agriculture operations.
  • Resilient, energy-efficient, and healthful infrastructure. One identified priority was research to develop low-cost coatings for buildings and roads to reduce heat effects and increase self-cooling.
  • Water, ecosystem, and geoengineering assessments. The report identifies research in creating sensing, measuring, and AI models to analyze the flow of water to ensure its availability during droughts and other disruptive events caused or worsened by climate change.

“The groundwork ERVA has laid out in this report creates a blueprint for funders to invest in,” Grejner-Brzezinska says, “and catalyzes engineering research for a more secure and sustainable world. As agencies and research organizations enact legislation to reduce carbon emissions and bolster clean-energy technologies, engineering is poised to lead with research and development.”

IEEE is developing a strategy to guide the organization’s response to the global threat.

Use biology and engineering to interrupt the transfer of viruses

A virtual visioning event on Leveraging Biology to Power Engineering Impact was held in March. The hope, as explained on the event’s website, is to transform research where biology and engineering intersect: health care and medicine, agriculture, and high tech.

The experts considered research directions in three areas: use biology to inspire engineers to develop new components; adapt and adopt biological constructs beyond their original function; and create engineering systems and components that improve on biology. An example would be to interrupt the transfer of viruses from one species to another so as to reduce the spread of diseases.

The task force’s report on which research areas to pursue is scheduled to be released next month, according to Grejner-Brzezinska.

Protect infrastructure from hackers

One of today’s main engineering challenges, according to ERVA, is the protection of infrastructure against hackers and other threats. At the in-person visioning event held last month at MIT on the Engineering R&D Solutions for Unhackable Infrastructure theme, researchers discussed gaps in security technologies and looked at how to design trustworthy systems and how to build resilience into interdependent infrastructures.

ERVA defines “unhackable” as the ability to ensure safety, security, and trust in the essential systems and services that society relies on.

The task force examined research themes related to physical infrastructure such as assets and hardware; software and algorithms; and data and communication networks. It also considered new security methods for users, operators, and security administrators to thwart cyberattacks.

Grejner-Brzezinska says the task force’s report will be released in mid-December.

Sustainable transportation networks

Planning has begun for the next visioning event, Sustainable Transportation Networks, to be held virtually on 2 and 3 November. The session is to explore innovative and sustainable transportation modes and the infrastructure networks needed to support them. Some of the areas to be discussed are green construction; longitudinal impact studies; interconnected transportation modes such as rail, marine, and air transport; and transportation equity.

Become an ERVA supporter

ERVA will convene four visioning events each year on broad engineering research themes that have the potential to solve societal challenges, Grejner-Brzezinska says. IEEE members who are experts in the fields can get involved by joining the ERVA Champions, now more than 900 strong. They are among the first to learn about upcoming visioning sessions and about openings to serve on volunteer groups such as thematic task forces, advisory boards, and standing councils. Members can sign up on the ERVA website.

“Becoming a champion is an opportunity to break out of your silos of disciplines and really come together with others in the engineering research community,” Grejner-Brzezinska says. “You can do what engineers do best: solve problems.”

Match ID: 1 Score: 25.71 source: spectrum.ieee.org age: 3 days
qualifiers: 12.86 climate change, 12.86 carbon

Interview: New UN climate chief takes the fight personally
Sat, 24 Sep 2022 13:43:02 EDT
The United Nations official now in charge of the fight to curb climate change has a personal stake in the battle to reduce emissions
Match ID: 2 Score: 15.00 source: www.washingtonpost.com age: 0 days
qualifiers: 15.00 climate change

This dash for growth represents the death of green Toryism
Sat, 24 Sep 2022 16:00:08 GMT

Boris Johnson was far more eco-conscious than recent Conservative predecessors. But this mini-budget is a reversion to type

The dash for growth by Kwasi Kwarteng means unshackling City bankers and property developers from the taxes and regulations that prevent them from paving over what’s left of Britain’s green and pleasant land.

The humble concrete mixer will be elevated to exalted status. There will be more executive homes built on greenfield sites. More distribution sheds dotted along busy A-roads. And more urban renewal of the kind that involves tearing down buildings in a plume of dust and carbon emissions to replace them with something not much better, at least not in environmental terms.

Continue reading...
Match ID: 3 Score: 15.00 source: www.theguardian.com age: 0 days
qualifiers: 15.00 carbon

Philadelphia’s Diatom Archive Is a Way, Way, Wayback Machine
Sat, 24 Sep 2022 12:00:00 +0000
A cache of phytoplankton at the Academy of Natural Sciences of Drexel University is helping researchers reconstruct historical coastlines.
Match ID: 4 Score: 15.00 source: www.wired.com age: 0 days
qualifiers: 15.00 climate change

Decarbonising the energy system by 2050 could save trillions - Oxford study

Match ID: 5 Score: 15.00 source: www.reddit.com age: 0 days
qualifiers: 15.00 carbon

Mercedes’ F1 team cut its freight emissions by 89% with biofuel switch
Fri, 23 Sep 2022 14:47:08 +0000
16 trucks ran on biofuels to haul freight between the final three European races this year.
Match ID: 6 Score: 15.00 source: arstechnica.com age: 1 day
qualifiers: 15.00 carbon

Yeti CFO Paul Carbone resigning effective Oct. 28, shares dip 3.5% premarket
Fri, 23 Sep 2022 12:06:08 GMT

Yeti Holdings Inc. said Friday that Chief Financial Officer Paul Carbone is resigning effective Oct. 28, to pursue a business opportunity that will allow him to be closer to family in Boston. The provider of outdoor products such as coolers, drinkware, and backpacks has commenced a search for a replacement. Shares were down 3.5% premarket and have fallen 65% in the year to date, while the S&P 500 has fallen 21%.

Match ID: 7 Score: 15.00 source: www.marketwatch.com age: 1 day
qualifiers: 15.00 carbon

Mini-budget fell far short of promoting low-carbon future for UK
Fri, 23 Sep 2022 12:00:08 GMT

While not devoid of green measures, Kwarteng’s announcement was more notable for what it did not include

The chancellor, Kwasi Kwarteng, has announced that the effective ban on onshore wind farms is to be lifted, and the poorest households will regain access to insulation and energy efficiency measures.

Polls show that onshore wind is popular, with more than 70% of people supporting it. Jess Ralston, a senior analyst at the Energy and Climate Intelligence Unit, said: “The ban on onshore wind has been a major anomaly in British energy policy given it’s both cheap and popular with the public. So a decision to lift the ban suggests [Kwarteng] has listened to the experts and understands building more British renewables reduces our reliance on costly gas and so brings down bills.”

Continue reading...
Match ID: 8 Score: 15.00 source: www.theguardian.com age: 1 day
qualifiers: 15.00 carbon

Climate change risk to coastal castles - English Heritage
Fri, 23 Sep 2022 00:04:19 GMT
Rising sea levels are threatening ancient castles and forts at an accelerating rate, says English Heritage.
Match ID: 9 Score: 15.00 source: www.bbc.co.uk age: 1 day
qualifiers: 15.00 climate change

Climate change: Spike in Amazon emissions linked to law enforcement
Thu, 22 Sep 2022 23:00:23 GMT
Scientists say a huge increase in deforestation in the Amazon is linked to lax law enforcement.
Match ID: 10 Score: 15.00 source: www.bbc.co.uk age: 1 day
qualifiers: 15.00 climate change

Lawns Are Dumb. But Ripping Them Out May Come With a Catch
Thu, 22 Sep 2022 12:00:00 +0000
Meticulous turf is environmentally terrible. Yet grass does have one charm: It “sweats,” helping cool the local area.
Match ID: 11 Score: 15.00 source: www.wired.com age: 2 days
qualifiers: 15.00 climate change

Europe’s Heat Waves Offer a Grim Vision of the Future
Thu, 22 Sep 2022 11:00:00 +0000
Extreme temperatures are the direct result of climate change, which means more intense heat events, wildfires, and droughts to come.
Match ID: 12 Score: 15.00 source: www.wired.com age: 2 days
qualifiers: 15.00 climate change

UN chief: 'Tax fossil fuel profits for climate damage'
Tue, 20 Sep 2022 13:30:00 GMT
Tax fossil fuel companies' profits to pay for the damage done by climate change, says UN Secretary General.
Match ID: 13 Score: 10.71 source: www.bbc.co.uk age: 4 days
qualifiers: 10.71 climate change

We Can Now Train Big Neural Networks on Small Devices
Tue, 20 Sep 2022 13:02:00 +0000

The gadgets around us are constantly learning about our lives. Smartwatches pick up on our vital signs to track our health. Home speakers listen to our conversations to recognize our voices. Smartphones play grammarian, watching what we write in order to fix our idiosyncratic typos. We appreciate these conveniences, but the information we share with our gadgets isn’t always kept between us and our electronic minders. Machine learning can require heavy hardware, so “edge” devices like phones often send raw data to central servers, which then return trained algorithms. Some people would like that training to happen locally. A new AI training method expands the training capabilities of smaller devices, potentially helping to preserve privacy.

The most powerful machine-learning systems use neural networks, complex functions filled with tunable parameters. During training, a network receives an input (such as a set of pixels), generates an output (such as the label “cat”), compares its output with the correct answer, and adjusts its parameters to do better next time. To know how to tune each of those internal knobs, the network needs to remember the effect of each one, but they regularly number in the millions or even billions. That requires a lot of memory. Training a neural network can require hundreds of times the memory called upon when merely using one (also called “inference”). In the latter case, the memory is allowed to forget what each layer of the network did as soon as it passes information to the next layer.

To reduce the memory demanded during the training phase, researchers have employed a few tricks. In one, called paging or offloading, the machine moves activations (the intermediate values each layer produces) from short-term memory to a slower but more abundant type of memory, such as flash or an SD card, then brings them back when needed. In another, called rematerialization, the machine deletes the activations and computes them again later. Previously, memory-reduction systems used one of those two tricks or, says Shishir Patil, a computer scientist at the University of California, Berkeley, and the lead author of the paper describing the innovation, combined them using “heuristics” that are “suboptimal,” often requiring a lot of energy. The innovation reported by Patil and his collaborators formalizes the combination of paging and rematerialization.
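The trade-off between the two tricks can be illustrated with a toy cost model (all numbers and field names below are invented for illustration; this is not POET's actual cost model):

```python
# Toy model of the paging-vs-rematerialization choice. Costs are made-up
# round numbers: paging cost scales with activation size (write + read to
# slow memory), recomputation cost scales with the FLOPs needed to redo it.

def cheapest_strategy(activation):
    """Pick the lower-energy way to make an activation available again."""
    page_cost = activation["size_kb"] * 0.5       # hypothetical energy per KB moved
    recompute_cost = activation["flops"] * 0.001  # hypothetical energy per FLOP
    return "page" if page_cost < recompute_cost else "rematerialize"

activations = [
    {"name": "conv1", "size_kb": 512, "flops": 100_000},    # big but cheap to redo
    {"name": "attn3", "size_kb": 64,  "flops": 5_000_000},  # small but costly to redo
]

for act in activations:
    print(act["name"], "->", cheapest_strategy(act))
```

The real system makes this choice jointly across all activations under memory and time budgets, rather than greedily per activation as in this sketch.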

“Taking these two techniques, combining them well into this optimization problem, and then solving it—that’s really nice,” says Jiasi Chen, a computer scientist at the University of California, Riverside, who works on edge computing but was not involved in the work.

In July, Patil presented his system, called POET (private optimal energy training), at the International Conference on Machine Learning, in Baltimore. He first gives POET a device’s technical details and information about the architecture of a neural network he wants it to train. He specifies a memory budget and a time budget. He then asks it to create a training process that minimizes energy usage. The process might decide to page certain activations that would be inefficient to recompute but rematerialize others that are simple to redo but require a lot of memory to store.

One of the keys to the breakthrough was to define the problem as a mixed integer linear programming (MILP) puzzle, a set of constraints and relationships between variables. For each device and network architecture, POET plugs its variables into Patil’s hand-crafted MILP program, then finds the optimal solution. “A main challenge is actually formulating that problem in a nice way so that you can input it into a solver,” Chen says. “So, you capture all of the realistic system dynamics, like energy, latency, and memory.”
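A drastically simplified version of that optimization can be sketched as a toy integer program, here solved by brute force over the binary choices (POET's real MILP also models paging costs, latency, and scheduling order; the sizes and energies below are invented):

```python
# Toy version of the optimization: for each activation, a binary choice
# (keep in RAM vs. rematerialize), minimizing total recomputation energy
# subject to a memory budget. Brute force stands in for a MILP solver.
from itertools import product

acts = [  # (name, RAM in KB if kept, energy cost if rematerialized)
    ("a1", 400, 5.0),
    ("a2", 300, 1.0),
    ("a3", 200, 8.0),
]
BUDGET_KB = 500

best = None
for choice in product([0, 1], repeat=len(acts)):  # 1 = rematerialize
    mem = sum(a[1] for a, c in zip(acts, choice) if c == 0)
    energy = sum(a[2] for a, c in zip(acts, choice) if c == 1)
    if mem <= BUDGET_KB and (best is None or energy < best[0]):
        best = (energy, choice)

rematerialized = [a[0] for a, c in zip(acts, best[1]) if c == 1]
print("energy:", best[0], "rematerialize:", rematerialized)
```

Even this tiny instance shows the solver's job: keeping a2 and a3 exactly fills the budget and leaves only the moderately priced a1 to recompute.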

The team tested POET on four different processors, whose RAM ranged from 32 KB to 8 GB. On each, the researchers trained three different neural network architectures: two types popular in image recognition (VGG16 and ResNet-18), plus a popular language-processing network (BERT). In many of the tests, the system could reduce memory usage by about 80 percent, without a big bump in energy use. Comparable methods couldn’t do both at the same time. According to Patil, the study showed that BERT can now be trained on the smallest devices, which was previously impossible.

“When we started off, POET was mostly a cute idea,” Patil says. Now, several companies have reached out about using it, and at least one large company has tried it in its smart speaker. One thing they like, Patil says, is that POET doesn’t reduce network precision by “quantizing,” or abbreviating, activations to save memory. So the teams that design networks don’t have to coordinate with teams that implement them in order to negotiate trade-offs between precision and memory.

Patil notes other reasons to use POET besides privacy concerns. Some devices need to train networks locally because they have low or no Internet connection. These include devices used on farms, in submarines, or in space. Other setups can benefit from the innovation because data transmission requires too much energy. POET could also make large devices—Internet servers—more memory efficient and energy efficient. But as for keeping data private, Patil says, “I guess this is very timely, right?”

Match ID: 14 Score: 10.71 source: spectrum.ieee.org age: 4 days
qualifiers: 10.71 carbon

Satellite Imagery for Everyone
Sat, 19 Feb 2022 16:00:00 +0000

Every day, satellites circling overhead capture trillions of pixels of high-resolution imagery of the surface below. In the past, this kind of information was mostly reserved for specialists in government or the military. But these days, almost anyone can use it.

That’s because the cost of sending payloads, including imaging satellites, into orbit has dropped drastically. High-resolution satellite images, which used to cost tens of thousands of dollars, now can be had for the price of a cup of coffee.

What’s more, with the recent advances in artificial intelligence, companies can more easily extract the information they need from huge digital data sets, including ones composed of satellite images. Using such images to make business decisions on the fly might seem like science fiction, but it is already happening within some industries.

These underwater sand dunes adorn the seafloor between Andros Island and the Exuma islands in the Bahamas. The turquoise to the right reflects a shallow carbonate bank, while the dark blue to the left marks the edge of a local deep called Tongue of the Ocean. This image was captured in April 2020 using the Moderate Resolution Imaging Spectroradiometer on NASA’s Terra satellite.

Joshua Stevens/NASA Earth Observatory

Here’s a brief overview of how you, too, can access this kind of information and use it to your advantage. But before you’ll be able to do that effectively, you need to learn a little about how modern satellite imagery works.

The orbits of Earth-observation satellites generally fall into one of two categories: GEO and LEO. The former is shorthand for geosynchronous equatorial orbit. GEO satellites are positioned roughly 36,000 kilometers above the equator, where they circle in sync with Earth’s rotation. Viewed from the ground, these satellites appear to be stationary, in the sense that their bearing and elevation remain constant. That’s why GEO is said to be a geostationary orbit.

Such orbits are, of course, great for communications relays—it’s what allows people to mount satellite-TV dishes on their houses in a fixed orientation. But GEO satellites are also appropriate when you want to monitor some region of Earth by capturing images over time. Because the satellites are so high up, the resolution of that imagery is quite coarse, however. So these orbits are primarily used for observation satellites designed to track changing weather conditions over broad areas.

Being stationary with respect to Earth means that GEO satellites are always within range of a downlink station, so they can send data back to Earth in minutes. This allows them to alert people to changes in weather patterns almost in real time. Most of this kind of data is made available for free by the U.S. National Oceanographic and Atmospheric Administration.

In March 2021, the container ship Ever Given ran aground, blocking the Suez Canal for six days. This satellite image of the scene, obtained using synthetic-aperture radar, shows the kind of resolution that is possible with this technology.

Capella Space

The other option is LEO, which stands for low Earth orbit. Satellites placed in LEO are much closer to the ground, which allows them to obtain higher-resolution images. And the lower you can go, the better the resolution you can get. The company Planet, for example, increased the resolution of its recently completed satellite constellation, SkySat, from 72 centimeters per pixel to just 50 cm—an incredible feat—by lowering the orbits its satellites follow from 500 to 450 km and improving the image processing.

The best commercially available spatial resolution for optical imagery is 25 cm, which means that one pixel represents a 25-by-25-cm area on the ground—roughly the size of your laptop. A handful of companies capture data with 25-cm to 1-meter resolution, which is considered high to very high resolution in this industry. Some of these companies also offer data from 1- to 5-meter resolution, considered medium to high resolution. Finally, several government programs have made optical data available at 10-, 15-, 30-, and 250-meter resolutions for free with open data programs. These include NASA/U.S. Geological Survey Landsat, NASA MODIS (Moderate Resolution Imaging Spectroradiometer), and ESA Copernicus. This imagery is considered low resolution.
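Some quick arithmetic shows what these resolutions mean in terms of coverage (the 10,000-by-10,000-pixel frame size is an assumed round number for illustration, not any particular sensor's specification):

```python
# Ground area per pixel and per image frame at a given resolution.

def pixel_area_m2(resolution_m):
    """Ground area covered by one pixel, in square meters."""
    return resolution_m ** 2

def frame_area_km2(resolution_m, width_px=10_000, height_px=10_000):
    """Ground area covered by one image frame, in square kilometers."""
    return (resolution_m * width_px) * (resolution_m * height_px) / 1e6

for res in (0.25, 0.5, 10.0):  # 25 cm, 50 cm, and 10 m per pixel
    print(f"{res} m/px: {pixel_area_m2(res):.4f} m^2 per pixel, "
          f"{frame_area_km2(res):.1f} km^2 per frame")
```

The scaling is quadratic: halving the resolution figure quarters the area each pixel covers, which is why high-resolution satellites image far less ground per pass.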

Because the satellites that provide the highest-resolution images are in the lowest orbits, they sense less area at once. To cover the entire planet, a satellite can be placed in a polar orbit, which takes it from pole to pole. As it travels, Earth rotates under it, so on its next pass, it will be above a different part of Earth.

Many of these satellites don’t pass directly over the poles, though. Instead, they are placed in a near-polar orbit that has been specially designed to take advantage of a subtle bit of physics. You see, the spinning Earth bulges outward slightly at the equator. That extra mass causes the orbits of satellites that are not in polar orbits to shift or (technically speaking) to precess. Satellite operators often take advantage of this phenomenon to put a satellite in what’s called a sun-synchronous orbit. Such orbits allow the repeated passes of the satellite over a given spot to take place at the same time of day. Not having the pattern of shadows shift between passes helps the people using these images to detect changes.
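That subtle bit of physics can be made concrete with a back-of-the-envelope calculation using the standard J2 nodal-precession formula for a circular orbit (the 500-kilometer altitude is chosen to match the orbits mentioned above):

```python
import math

# A sun-synchronous orbit's plane must precess 360 degrees per year so the
# satellite crosses each latitude at the same local solar time.
required = math.radians(360.0 / 365.25) / 86400.0  # rad/s

# Earth's equatorial bulge (the J2 term) precesses a circular orbit's plane
# at rate = -1.5 * J2 * n * (Re/a)^2 * cos(i). Solve for inclination i.
J2 = 1.08263e-3               # Earth's oblateness coefficient
RE = 6378.137e3               # equatorial radius, m
MU = 3.986004418e14           # gravitational parameter, m^3/s^2
a = RE + 500e3                # 500 km circular orbit
n = math.sqrt(MU / a**3)      # mean motion, rad/s

cos_i = -required / (1.5 * J2 * n * (RE / a) ** 2)
inclination_deg = math.degrees(math.acos(cos_i))
print(f"sun-synchronous inclination at 500 km: {inclination_deg:.1f} deg")
```

The answer comes out just above 97 degrees, the slightly retrograde, near-polar inclination typical of sun-synchronous imaging satellites.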

It usually takes 24 hours for a satellite in polar orbit to survey the entire surface of Earth. To image the whole world more frequently, satellite companies use multiple satellites, all equipped with the same sensor and following different orbits. In this way, these companies can provide more frequently updated images of a given location. For example, Maxar’s Worldview Legion constellation, launching later this year, includes six satellites.

After a satellite captures some number of images, all that data needs to be sent down to Earth and processed. The time required for that varies.

DigitalGlobe (which Maxar acquired in 2017) recently announced that it had managed to send data from a satellite down to a ground station and then store it in the cloud in less than a minute. That was possible because the image sent back was of the parking lot of the ground station, so the satellite didn’t have to travel between the collection point and where it had to be to do the data “dumping,” as this process is called.

In general, Earth-observation satellites in LEO don’t capture imagery all the time—they do that only when they are above an area of special interest. That’s because these satellites are limited in how much data they can send down at one time. Typically, they can transmit data for only 10 minutes or so before they get out of range of a ground station. And they cannot record more data than they’ll have time to dump.
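The constraint is easy to quantify. Assuming a 200-megabit-per-second downlink (a plausible round number for illustration, not any specific satellite's link budget), a 10-minute pass yields:

```python
# How much data fits in one ground-station pass. The link rate here is an
# assumed round number, not any particular satellite's specification.
link_rate_bits_per_s = 200e6   # 200 Mbit/s (assumed)
pass_seconds = 10 * 60         # one 10-minute pass

gigabytes_per_pass = link_rate_bits_per_s * pass_seconds / 8 / 1e9
print(f"{gigabytes_per_pass:.0f} GB per 10-minute pass")
```

A budget on that order is quickly consumed by high-resolution captures, which is why satellites record only over areas of interest.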

Currently, ground stations are located mostly near the poles, the areas most frequently overflown by satellites in polar orbits. But we can soon expect distances to the nearest ground station to shorten, because both Amazon and Microsoft have announced intentions to build large networks of ground stations located all over the world. As it turns out, hosting the terabytes of satellite data that are collected daily is big business for these companies, which sell their cloud services (Amazon Web Services and Microsoft’s Azure) to satellite operators.

For now, if you are looking for imagery of an area far from a ground station, expect a significant delay—maybe hours—between capture and transmission of the data. The data will then have to be processed, which adds yet more time. The fastest providers currently make their data available within 48 hours of capture, but not all can manage that. While it is possible, under ideal weather conditions, for a commercial entity to request a new capture and get the data it needs delivered the same week, such quick turnaround times are still considered cutting edge.

I’ve been using the word “imagery,” but it’s important to note that satellites do not capture images the same way ordinary cameras do. The optical sensors in satellites are calibrated to measure reflectance over specific bands of the electromagnetic spectrum. This could mean they record how much red, green, and blue light is reflected from different parts of the ground. The satellite operator will then apply a variety of adjustments to correct colors, combine adjacent images, and account for parallax, forming what’s called a true-color composite image, which looks pretty much like what you would expect to get from a good camera floating high in the sky and pointed directly down.
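A minimal sketch of the last step, turning raw per-band reflectance counts into a displayable true-color pixel, might look like the following (the sensor counts and calibration range are invented; real pipelines also handle atmospheric correction, mosaicking, and parallax):

```python
# Sketch of per-band contrast stretching for a true-color composite.
# Raw values stand in for 12-bit sensor counts; the display range is a
# hypothetical per-sensor calibration, not a real instrument's.

def stretch(band_value, lo, hi):
    """Linearly rescale a raw sensor count to the 0-255 display range."""
    v = (band_value - lo) / (hi - lo)
    return round(255 * min(max(v, 0.0), 1.0))  # clamp, then scale

raw = {"red": 1700, "green": 2300, "blue": 2900}  # invented counts, one pixel
display_range = (500, 3500)                       # assumed calibration window

rgb = tuple(stretch(raw[b], *display_range) for b in ("red", "green", "blue"))
print("true-color pixel:", rgb)
```

Applied across every pixel and band, this kind of rescaling is what turns calibrated reflectance measurements into an image that looks like it came from an ordinary camera pointed straight down.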

Imaging satellites can also capture data outside of the visible-light spectrum. The near-infrared band is widely used in agriculture, for example, because these images help farmers gauge the health of their crops. This band can also be used to detect soil moisture and a variety of other ground features that would otherwise be hard to determine.

Longer-wavelength “thermal” IR does a good job of penetrating smoke and picking up heat sources, making it useful for wildfire monitoring. And synthetic-aperture radar satellites, which I discuss in greater detail below, are becoming more common because the images they produce aren’t affected by clouds and don’t require the sun for illumination.

You might wonder whether aerial imagery, say, from a drone, wouldn’t work at least as well as satellite data. Sometimes it can. But for many situations, using satellites is the better strategy. Satellites can capture imagery over areas that would be difficult to access otherwise because of their remoteness, for example. Or there could be other sorts of accessibility issues: The area of interest could be in a conflict zone, on private land, or in another place that planes or drones cannot overfly.

So with satellites, organizations can easily monitor the changes taking place at various far-flung locations. Satellite imagery allows pipeline operators, for instance, to quickly identify incursions into their right-of-way zones. The company can then take steps to prevent a disastrous incident, such as someone puncturing a gas pipeline while construction is taking place nearby.

This SkySat image shows the effect of a devastating landslide that took place on 30 December 2020. Debris from that landslide destroyed buildings and killed 10 people in the Norwegian village of Ask.


The ability to compare archived imagery with recently acquired data has helped a variety of industries. For example, insurance companies sometimes use satellite data to detect fraudulent claims (“Looks like your house had a damaged roof when you bought it…”). And financial-investment firms use satellite imagery to evaluate such things as retailers’ future profits based on parking-lot fullness or to predict crop prices before farmers report their yields for the season.

Satellite imagery provides a particularly useful way to find or monitor the location of undisclosed features or activities. Sarah Parcak of the University of Alabama, for example, uses satellite imagery to locate archaeological sites of interest. 52Impact, a consulting company in the Netherlands, identified undisclosed waste dump sites by training an algorithm to recognize their telltale spectral signature. Satellite imagery has also helped identify illegal fishing activities, fight human trafficking, monitor oil spills, get accurate reporting on COVID-19 deaths, and even investigate Uyghur internment camps in China—all situations where the primary actors couldn’t be trusted to accurately report what’s going on.

Despite these many successes, investigative reporters and nongovernmental organizations aren’t yet using satellite data regularly, perhaps because even the small cost of the imagery is a deterrent. Thankfully, some kinds of low-resolution satellite data can be had for free.

The first place to look for free satellite imagery is the Copernicus Open Access Hub and EarthExplorer. Both offer free access to a wide range of open data. The imagery is lower resolution than what you can purchase, but if the limited resolution meets your needs, why spend money?

If you require medium- or high-resolution data, you might be able to buy it directly from the relevant satellite operator. This field recently went through a period of mergers and acquisitions, leaving only a handful of providers, the big three in the West being Maxar and Planet in the United States and Airbus in Germany. There are also a few large Asian providers, such as SI Imaging Services in South Korea and Twenty First Century Aerospace Technology in Singapore. Most providers have a commercial branch, but they primarily target government buyers. And they often require large minimum purchases, which is unhelpful to companies looking to monitor hundreds of locations or fewer.

Fortunately, approaching a satellite operator isn’t the only option. In the past five years, a cottage industry of consultants and local resellers with exclusive deals to service a certain market has sprung up. Aggregators and resellers spend years negotiating contracts with multiple providers so they can offer customers access to data sets at more attractive prices, sometimes for as little as a few dollars per image. Some companies providing geographic information systems—including Esri, L3Harris, and Safe Software—have also negotiated reselling agreements with satellite-image providers.

Traditional resellers are middlemen who will connect you with a salesperson to discuss your needs, obtain quotes from providers on your behalf, and negotiate pricing and priority schedules for image capture and sometimes also for the processing of the data. This is the case for Apollo Mapping, European Space Imaging, Geocento, LandInfo, Satellite Imaging Corp., and many more. The more innovative resellers will give you access to digital platforms where you can check whether an image you need is available from a certain archive and then order it. Examples include LandViewer from EOS and Image Hunter from Apollo Mapping.

More recently, a new crop of aggregators began offering customers the ability to programmatically access Earth-observation data sets. These companies work best for people looking to integrate such data into their own applications or workflows. These include the company I work for, SkyWatch, which provides such a service, called EarthCache. Other examples are UP42 from Airbus and Sentinel Hub from Sinergise.

While you will still need to talk with a sales rep to activate your account—most often to verify you will use the data in ways that fit the company’s terms of service and licensing agreements—once you’ve been granted access to their applications, you will be able to programmatically order archive data from one or multiple providers. SkyWatch is, however, the only aggregator allowing users to programmatically request future data to be collected (“tasking a satellite”).
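Because each aggregator defines its own schema, the sketch below only shows the general shape of a programmatic archive order. The field names and values are hypothetical, not the actual EarthCache, UP42, or Sentinel Hub API:

```python
# Assemble a hypothetical archive-order request for an aggregator API.
# Every field name here is invented; consult the provider's
# documentation for the real schema.
import json

def build_archive_order(aoi_wkt, start, end, max_cloud_pct, max_gsd_m):
    return {
        "location": aoi_wkt,                   # area of interest as WKT
        "start_date": start,                   # ISO 8601 date strings
        "end_date": end,
        "cloud_cover_percentage": max_cloud_pct,
        "resolution_m": max_gsd_m,             # coarsest acceptable resolution
    }

order = build_archive_order(
    "POLYGON((13.3 52.5, 13.4 52.5, 13.4 52.6, 13.3 52.6, 13.3 52.5))",
    "2022-01-01", "2022-01-31", max_cloud_pct=20, max_gsd_m=1.0,
)
payload = json.dumps(order)  # body of a POST to the aggregator's order endpoint
```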

While satellite imagery is fantastically abundant and easy to access today, two changes are afoot that will expand further what you can do with satellite data: faster revisits and greater use of synthetic-aperture radar (SAR).


Satellite images have helped to reveal China’s treatment of its Muslim Uyghur minority. About a million Uyghurs (and other ethnic minorities) have been interned in prisons or camps like the one shown here [top], which lies to the east of the city of Ürümqi, the capital of China’s Xinjiang Uyghur Autonomous Region. Another satellite image [bottom] shows the characteristic oval shape of a fixed-chimney Bull’s trench kiln, a type widely used for manufacturing bricks in southern Asia. This one is located in Pakistan’s Punjab province. This design poses environmental concerns because of the sooty air pollution it generates, and such kilns have also been associated with human-rights abuses. Top: CNES/Airbus/Google Earth; Bottom: Maxar Technologies/Google Earth

The first of these developments is not surprising. As more Earth-observation satellites are put into orbit, more images will be taken, more often. So the frequency with which any given area is imaged will increase. Right now, that’s typically two or three times a week. Expect the revisit rate soon to become several times a day. This won’t entirely address the challenge of clouds obscuring what you want to view, but it will help.
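As a back-of-the-envelope illustration (my numbers, not the article's): if a single satellite revisits a site every few days, an evenly phased constellation of identical satellites divides that interval by the constellation size:

```python
def revisit_interval_hours(single_sat_revisit_days, n_satellites):
    """Idealized revisit interval for evenly phased, identical satellites
    (ignores orbital geometry, pointing agility, and cloud cover)."""
    return single_sat_revisit_days * 24 / n_satellites

# One satellite: roughly two visits a week. Twenty: several a day.
print(revisit_interval_hours(3.5, 1))   # 84.0 (hours)
print(revisit_interval_hours(3.5, 20))  # 4.2 (hours)
```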

The second development is more subtle. Data from the two satellites of the European Space Agency’s Sentinel-1 SAR mission, available at no cost, has enabled companies to dabble in SAR over the last few years.

With SAR, the satellite beams radio waves down and measures the return signals bouncing off the surface. It does that continually, and clever processing is used to turn that data into images. The use of radio allows these satellites to see through clouds and to collect measurements day and night. Depending on the radar band that’s employed, SAR imagery can be used to judge material properties, moisture content, precise movements, and elevation.

As more companies get familiar with such data sets, there will no doubt be a growing demand for satellite SAR imagery, which has been widely used by the military since the 1970s. But it’s just now starting to appear in commercial products. You can expect those offerings to grow dramatically, though.

Indeed, a large portion of the money being invested in this industry is currently going to fund large SAR constellations, including those of Capella Space, Iceye, Synspective, XpressSAR, and others. The market is going to get crowded fast, which is great news for customers. It means they will be able to obtain high-resolution SAR images of the place they’re interested in, taken every hour (or less), day or night, cloudy or clear.

People will no doubt figure out wonderful new ways to employ this information, so the more folks who have access to it, the better. This is something my colleagues at SkyWatch and I deeply believe, and it’s why we’ve made it our mission to help democratize access to satellite imagery.

One day in the not-so-distant future, Earth-observation satellite data might become as ubiquitous as GPS, another satellite technology first used only by the military. Imagine, for example, being able to take out your phone and say something like, “Show me this morning’s soil-moisture map for Grover’s Corners High; I want to see whether the baseball fields are still soggy.”

This article appears in the March 2022 print issue as “A Boom with a View.”

Editor's note: The original version of this article incorrectly stated that Maxar's Worldview Legion constellation launched last year.

Match ID: 15 Score: 7.86 source: spectrum.ieee.org age: 217 days
qualifiers: 5.71 air pollution, 2.14 carbon

Solar-to-Jet-Fuel System Readies for Takeoff
Wed, 03 Aug 2022 17:00:00 +0000

As climate change edges from crisis to emergency, the aviation sector looks set to miss its 2050 goal of net-zero emissions. In the five years preceding the pandemic, the top four U.S. airlines—American, Delta, Southwest, and United—saw a 15 percent increase in the use of jet fuel. Despite continual improvements in engine efficiencies, that number is projected to keep rising.

A glimmer of hope, however, comes from solar fuels. For the first time, scientists and engineers at the Swiss Federal Institute of Technology (ETH) in Zurich have reported a successful demonstration of an integrated fuel-production plant for solar kerosene. Using concentrated solar energy, they were able to produce kerosene from water vapor and carbon dioxide captured directly from air. Fuel thus produced is a drop-in alternative to fossil-derived fuels and can be used with existing storage and distribution infrastructures, and engines.

Fuels derived from synthesis gas (or syngas)—an intermediate product that is a specific mixture of carbon monoxide and hydrogen—are a known alternative to conventional, fossil-derived fuels. Liquid fuel is produced from syngas by Fischer-Tropsch (FT) synthesis, in which chemical reactions convert carbon monoxide and hydrogen into hydrocarbons. The team of researchers at ETH found that a solar-driven thermochemical method to split water and carbon dioxide using a metal oxide redox cycle can produce renewable syngas. They demonstrated the process in a rooftop solar refinery at the ETH Machine Laboratory in 2019.
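FT synthesis consumes syngas at a roughly fixed ratio: each chain-growth step turns one CO and two H2 into a –CH2– unit plus water. A quick sanity check of the syngas demand for a kerosene-like C12 chain (my arithmetic, not from the paper):

```python
# Fischer-Tropsch chain growth, olefin form: n CO + 2n H2 -> CnH2n + n H2O.
# Count the syngas consumed to build one kerosene-like C12 chain.
def syngas_for_chain(n_carbons):
    co = n_carbons       # one CO per carbon added to the chain
    h2 = 2 * n_carbons   # two H2 per carbon added
    return co, h2

co, h2 = syngas_for_chain(12)
print(co, h2, h2 / co)  # 12 24 2.0 -> the ~2:1 H2-to-CO ratio FT plants target
```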

Reticulated porous structure made of ceria, used in the solar reactor to thermochemically split CO2 and H2O and produce syngas, a specific mixture of H2 and CO. ETH Zurich

The current pilot-scale solar tower plant was set up at the IMDEA Energy Institute in Spain. It scales up the solar reactor of the 2019 experiment by a factor of 10, says Aldo Steinfeld, an engineering professor at ETH who led the study. The fuel plant brings together three subsystems—the solar tower concentrating facility, solar reactor, and gas-to-liquid unit.

First, a heliostat field made of mirrors that rotate to follow the sun concentrates solar irradiation into a reactor mounted on top of the tower. The reactor is a cavity receiver lined with reticulated porous ceramic structures made of ceria (or cerium(IV) oxide). Within the reactor, the concentrated sunlight creates a high-temperature environment of about 1,500 °C, which is hot enough to split the captured carbon dioxide and water to produce syngas. Finally, the syngas is processed into kerosene in the gas-to-liquid unit. A centralized control room operates the whole system.

Fuel produced using this method closes the fuel carbon cycle, as it emits only as much carbon dioxide as went into its manufacture. “The present pilot fuel plant is still a demonstration facility for research purposes,” says Steinfeld, “but it is a fully integrated plant and uses a solar-tower configuration at a scale that is relevant for industrial implementation.”

“The solar reactor produced syngas with selectivity, purity, and quality suitable for FT synthesis,” the authors noted in their paper. They also reported good material stability for multiple consecutive cycles. They observed a value of 4.1 percent solar-to-syngas energy efficiency, which Steinfeld says is a record value for thermochemical fuel production, even though better efficiencies are required to make the technology economically competitive.

Schematic of the solar tower fuel plant. A heliostat field concentrates solar radiation onto a solar reactor mounted on top of the solar tower. The solar reactor co-splits water and carbon dioxide and produces a mixture of molecular hydrogen and carbon monoxide, which in turn is processed into drop-in fuels such as kerosene. ETH Zurich

“The measured value of energy conversion efficiency was obtained without any implementation of heat recovery,” he says. The heat rejected during the redox cycle of the reactor accounted for more than 50 percent of the solar-energy input. “This fraction can be partially recovered via thermocline heat storage. Thermodynamic analyses indicate that sensible heat recovery could potentially boost the energy efficiency to values exceeding 20 percent.”
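The direction of that effect is easy to see with simple bookkeeping. In the crude model below, recovered heat offsets part of the solar input; the inputs match the figures quoted here, but the greater-than-20-percent projection rests on a fuller thermodynamic analysis than this sketch:

```python
def efficiency_with_recovery(eta_base, rejected_fraction, recovery_fraction):
    """Crude model: heat recovered from the redox cycle offsets solar input.
    eta_base: baseline solar-to-syngas efficiency (0.041 reported here).
    rejected_fraction: share of solar input rejected as heat (>0.5 here).
    recovery_fraction: share of that rejected heat recovered and reused."""
    effective_input = 1.0 - rejected_fraction * recovery_fraction
    return eta_base / effective_input

print(round(efficiency_with_recovery(0.041, 0.5, 0.0), 3))  # 0.041 (no recovery)
print(round(efficiency_with_recovery(0.041, 0.5, 0.8), 3))  # 0.068
```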

To do so, more work is needed to optimize the ceramic structures lining the reactor, something the ETH team is actively working on, by looking at 3D-printed structures for improved volumetric radiative absorption. “In addition, alternative material compositions, that is, perovskites or aluminates, may yield improved redox capacity, and consequently higher specific fuel output per mass of redox material,” Steinfeld adds.

The next challenge for the researchers, he says, is the scale-up of their technology for higher solar-radiative power inputs, possibly using an array of solar cavity-receiver modules on top of the solar tower.

To bring solar kerosene into the market, Steinfeld envisages a quota-based system. “Airlines and airports would be required to have a minimum share of sustainable aviation fuels in the total volume of jet fuel that they put in their aircraft,” he says. This is possible as solar kerosene can be mixed with fossil-based kerosene. This would start out small, as little as 1 or 2 percent, which would raise the total fuel costs at first, though minimally—adding “only a few euros to the cost of a typical flight,” as Steinfeld puts it.

Meanwhile, rising quotas would lead to investment, and to falling costs, eventually replacing fossil-derived kerosene with solar kerosene. “By the time solar jet fuel reaches 10 to 15 percent of the total jet-fuel volume, we ought to see the costs for solar kerosene nearing those of fossil-derived kerosene,” he adds.
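The cost arithmetic behind such a quota is straightforward. With hypothetical prices (solar kerosene assumed several times the fossil price at first), a 1 or 2 percent blending share moves the total only slightly:

```python
def blended_price(fossil_price, solar_price, solar_share):
    """Per-liter price of a fossil/solar kerosene blend."""
    return fossil_price * (1 - solar_share) + solar_price * solar_share

# Hypothetical prices in euros per liter.
fossil, solar = 0.80, 4.00
for share in (0.01, 0.02, 0.10):
    print(share, round(blended_price(fossil, solar, share), 3))
# A 1-2 percent quota adds only a few cents per liter to the blend.
```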

However, we may not have to wait too long for flights to operate solely on solar fuel. A commercial spin-off of Steinfeld’s laboratory, Synhelion, is working on commissioning the first industrial-scale solar fuel plant in 2023. The company has also collaborated with the airline SWISS to conduct a flight solely using its solar kerosene.

Match ID: 16 Score: 4.29 source: spectrum.ieee.org age: 52 days
qualifiers: 2.14 climate change, 2.14 carbon

Climate change: Pakistan floods 'likely' made worse by warming
Thu, 15 Sep 2022 22:41:45 GMT
Emissions from human activities played a role in the recent floods that have brought devastation to Pakistan.
Match ID: 17 Score: 2.14 source: www.bbc.co.uk age: 8 days
qualifiers: 2.14 climate change

Ensuring Underwater Robots Survive in Hot Tropical Waters
Thu, 15 Sep 2022 15:00:00 +0000

The hot, humid environment of tropical marine areas such as Australia’s Great Barrier Reef can wreak havoc on marine autonomous systems (MAS). Underwater and surface MAS are used for marine monitoring, locating objects such as mines on the seafloor, and rescuing swimmers.

“Tropical conditions can cause systems to overheat or prevent high-density lithium batteries from recharging,” says Melanie Olsen, who is a project director of the Australian Institute of Marine Science’s (AIMS) ReefWorks, a technology testing and evaluation facility in northern Australia. “And the microbial and small creatures that thrive in these tropical environments grow rapidly on underwater surfaces and degrade the sensor performance and the hydrodynamics of the robotics and autonomous systems.”

Developing technology that can stand up to these conditions is part of Olsen’s job, as is supporting ReefWorks’ broader mission of helping others move their autonomous systems out of the lab. It’s essential to test these systems and collect compliance evidence to demonstrate they meet regulatory requirements and can be certified for operations, says Olsen, an IEEE senior member. But there are very few places to test marine robotics, autonomous systems, and artificial-intelligence (RAS-AI) technologies, which hampers the growth of the industry, Olsen says. “It’s difficult for RAS-AI vendors to progress from a prototype to a commercial product because the pathway to a certified system is complex.”

That’s why AIMS established ReefWorks. The facility is used to test crewed and uncrewed tropical and marine vessels as well as robots, sensors, and other innovations. “We are Australia’s—and possibly the world’s—first such testing facility in the tropics,” Olsen says. Examples of underwater and surface MAS include the ReefScan CoralAUV, which is used for marine monitoring, and the Wave Adaptive Modular Vessel, a surface vessel used for marine monitoring, locating mines and other objects on the seafloor, and rescuing swimmers.

AIMS has been testing equipment for over a decade, but this part of AIMS’s facilities opened to the public in December 2021. ReefWorks supports the entire development cycle, from digital-model validation and developmental testing to product and operational-level testing, Olsen says. Physical tests can be done at AIMS’s three marine field ranges, which offer different testing conditions. ReefWorks also has land-based facilities, plus the National Sea Simulator sensor test tank, and drone corridors between the at-sea ranges for verifying the performance of long-range marine autonomous systems.

“Our overall objective is to establish a sustainable marine autonomous systems [MAS] sector in Australia,” she says.

One of the ways ReefWorks helps its users make the most of their time on test ranges is to offer “digital twins” and virtual worlds. A digital twin is a virtual model of a real-world object, machine, or system that can be used to assess how the real-world counterpart is performing.

“Each of our test ranges is developing a digital twin,” Olsen says. “Developers will be able to conduct a test mission on the virtual range so when they get here, they can replay missions with real-time collected data, and validate their MAS digital-model performance.”

Olsen leads a team of five people and is currently recruiting another five. She expects the staff to triple in size in a few years as ReefWorks becomes more established in the region.

An IEEE senior member, Olsen is active with the IEEE Northern Australia Section. She served as the section chair in 2020 and 2021, during which time the section achieved the Region 10 Outstanding Small Section Award.

Integrating embedded AI and IOT edge computing

Before joining AIMS, Olsen spent a decade in Australia’s Department of Defence (DOD) as a lead engineer working on future technologies and maritime electronic-warfare systems.

Olsen grew up in a farming family and wasn’t really exposed to computers or engineers until an EE lecturer from James Cook University, in Australia, came to her rural high school to give a presentation. He brought along a remote-controlled quadrotor helicopter—a decade before quadcopters were commonplace.

The lecture led Olsen to pursue a bachelor’s degree in electrical, electronics, and computer systems, also from James Cook University, in Townsville. She went on to earn a master’s degree in systems engineering from Australia’s University of New South Wales, in Canberra. In 2016, Olsen took a job at AIMS as an engineering team leader in technology development.

“I’m very passionate about new technologies and seeing them integrated in the field,” she says. “During my decade at the [Australian] DOD, I grew my skills in systems engineering to solve more complex technology-integration challenges. AIMS offered me an opportunity to apply these skills to the challenges facing the tropical marine environment.”

“We are Australia’s—and possibly the world’s—first such testing facility in the tropics.”

There are many similarities between what Olsen had been doing at DOD and her role at ReefWorks. “My work at both DOD and AIMS requires an understanding of how electronic subsystems work, determining what’s viable for the use case, understanding the importance of modeling and simulation, and being able to communicate engineering terminology to an interdisciplinary team,” she says. “Both roles are all about engineering problem-solving.”

Olsen is currently working on integrating embedded AI and Internet of Things edge computing into AIMS infrastructure. “Artificial intelligence is used to increase a marine autonomous system’s capabilities,” she says. “For example, AI is used to train a MAS to navigate and avoid colliding with coral reefs, other vessels, or other objects or to allow the MAS to identify specific marine species, reef areas suitable for reseeding, and marine mines.”

IoT edge computing is used to process data closer to its point of origin. “This has the potential to speed up the decision process for vessels and operators while minimizing the communications and data bandwidth needed, which are key limitations when operating in marine northern Australia,” Olsen says.

Since GPS doesn’t work underwater, another of her team’s projects is looking for additional ways to conduct accurate geospatial positioning and control for missions that don’t require marine autonomous systems to come to the surface.

“We’re only just starting to get a feel for what marine autonomous systems can do—not just for our tropical marine waters but in general,” she says. “There are grand challenges no one can solve right now, like dealing with ocean pollution and the impacts of climate change.”

Robotics engineers needed

There’s nowhere near enough robotics engineers in the world, Olsen says. She recommends that engineering students take courses that include group projects.

“Group projects help you grow your ability to solve problems outside your knowledge or expertise,” she says. “They teach you how to work as an interdisciplinary team, who to ask for help, and where to find it.”

This article appears in the October 2022 print issue as “Melanie Olsen.”

Match ID: 18 Score: 2.14 source: spectrum.ieee.org age: 9 days
qualifiers: 2.14 climate change

MOXIE Shows How to Make Oxygen on Mars
Thu, 08 Sep 2022 15:27:59 +0000

Planning for the return journey is an integral part of the preparations for a crewed Mars mission. Astronauts will require a total mass of about 50 tonnes of rocket propellant for the ascent vehicle that will lift them off the planet’s surface, including approximately 31 tonnes of oxygen. One option is for crewed missions to carry the required oxygen themselves. But scientists are optimistic that it could instead be produced from the carbon dioxide–rich Martian atmosphere itself, using a system called MOXIE.

The Mars Oxygen ISRU (In-Situ Resource Utilization) Experiment is an 18-kilogram unit housed within the Perseverance rover on Mars. The unit is “the size of a toaster,” adds Jeffrey Hoffman, professor of aerospace engineering at MIT. Its job is to electrochemically break down carbon dioxide collected from the Martian atmosphere into oxygen and carbon monoxide. It also tests the purity of the oxygen.

Between February 2021, when it arrived on Mars aboard the Perseverance, and the end of that year, MOXIE had several successful test runs. According to a review of the system by Hoffman and colleagues, published in Science Advances, it has demonstrated its ability to produce oxygen during both night and day, when temperatures can vary by over 100 ºC. The generation and purity rates of oxygen also meet the requirements for producing rocket propellant and for breathing. The authors assert that a scaled-up version of MOXIE could produce the required oxygen for lift-off as well as for the astronauts to breathe.

Next question: How to power any oxygen-producing factories that NASA can land on Mars? Perhaps via NASA’s Kilopower fission reactors?

MOXIE is a first step toward a much larger and more complex system to support the human exploration of Mars. The researchers estimate a required generation rate of 2 to 3 kilograms per hour, compared with the current MOXIE rate of 6 to 8 grams per hour, to produce enough oxygen for lift-off for a crew arriving 26 months later. “So we’re talking about a system that’s a couple of hundred times bigger than MOXIE,” Hoffman says.

They calculate this rate accounting for eight months to get to Mars, followed by some time to set up the system. “We figure you’d probably have maybe 14 months to make all the oxygen.” Further, he says, the produced oxygen would have to be liquefied to be used as a rocket propellant, something the current version of MOXIE doesn’t do.
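Those figures can be reproduced directly: roughly 31 tonnes of oxygen spread over about 14 months of production gives the 2-to-3-kilograms-per-hour target, a few hundred times MOXIE's current output:

```python
def required_rate_kg_per_hour(oxygen_kg, months, hours_per_month=30 * 24):
    """Average production rate needed to finish the oxygen in time."""
    return oxygen_kg / (months * hours_per_month)

rate = required_rate_kg_per_hour(31_000, 14)
scale_vs_moxie = rate * 1000 / 7  # MOXIE today makes roughly 7 g/h
print(round(rate, 2), round(scale_vs_moxie))  # 3.08 439
```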

MOXIE also currently faces several design constraints because, says Hoffman, a former astronaut, “our only ride to Mars was inside the Perseverance rover.” This limited the amount of power available to operate the unit, the amount of heat they could produce, the volume and the mass.

“MOXIE does not work nearly as efficiently as a stand-alone system that was specifically designed would,” says Hoffman. Most of the time, it’s turned off. “Every time we want to make oxygen, we have to heat it up to 800 ºC, so most of the energy goes into heating it up and running the compressor, whereas in a well-designed stand-alone system, most of the energy will go into the actual electrolysis, into actually producing the oxygen.”

However, there are still many kinks to iron out for the scaling-up process. To begin with, any oxygen-producing system will need lots of power. Hoffman thinks nuclear power is the most likely option, maybe NASA’s Kilopower fission reactors. The setup and the cabling would certainly be challenging, he says. “You’re going to have to launch all of these nuclear reactors, and of course, they’re not going to be in exactly the same place as the [other] units,” he says. “So, robotically, you’re going to have to connect the electrical cables to bring power to the oxygen-producing unit.”

Then there are the solid oxide electrolysis units, which Hoffman points out are carefully machined systems. Fortunately, the company that makes them, OxEon, has already designed, built, and tested a full-scale unit, a hundred times bigger than the one on MOXIE. “Several of those units would be required to produce oxygen at the quantities that we need,” Hoffman says.

He also adds that at present, there is no redundancy built into MOXIE. If any part fails, the whole system dies. “If you’re counting on a system to produce oxygen for rocket propellant and for breathing, you need very high reliability, which means you’re going to need quite a few redundant units.”

Moreover, the system has to be pretty much autonomous, Hoffman says. “It has to be able to monitor itself, run itself.” For testing purposes, every time MOXIE is powered up, there is plenty of time to plan. A full-scale MOXIE system, though, would have to run continuously, and for that it has to be able to adjust automatically to changes in the Mars atmosphere, which can vary by a factor of two over a year, and between nighttime and daytime temperature differences.
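Compensating for a changing atmosphere is largely a matter of density: to hold the CO2 mass intake constant, the volumetric intake must rise as the atmosphere thins. A toy setpoint calculation (all numbers invented):

```python
def volumetric_setpoint(target_mass_flow_g_h, co2_density_g_m3):
    """Volume of atmosphere to ingest per hour for a constant CO2 mass flow."""
    return target_mass_flow_g_h / co2_density_g_m3

# Hypothetical CO2 densities in grams per cubic meter; Mars's near-surface
# density varies by roughly a factor of two over the year.
dense, thin = 20.0, 10.0
print(volumetric_setpoint(100.0, dense))  # 5.0 m^3/h
print(volumetric_setpoint(100.0, thin))   # 10.0 m^3/h, double the volume
```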

Match ID: 19 Score: 2.14 source: spectrum.ieee.org age: 16 days
qualifiers: 2.14 carbon

How Pakistan floods are linked to climate change
Fri, 02 Sep 2022 13:42:00 GMT
Pakistan's geography - and its immense glaciers - make it vulnerable to climate change.
Match ID: 20 Score: 2.14 source: www.bbc.co.uk age: 22 days
qualifiers: 2.14 climate change

Climate change: 'Staggering' rate of global tree losses from fires
Wed, 17 Aug 2022 09:00:36 GMT
A report says around 16 football pitches of trees per minute were lost to wildfires in 2021.
Match ID: 21 Score: 2.14 source: www.bbc.co.uk age: 38 days
qualifiers: 2.14 climate change

Inside the Universe Machine: The Webb Space Telescope’s Staggering Vision
Wed, 06 Jul 2022 13:00:00 +0000

For a deep dive into the engineering behind the James Webb Space Telescope, see our collection of posts here.

“Build something that will absolutely, positively work.” This was the mandate from NASA for designing and building the James Webb Space Telescope—at 6.5 meters wide the largest space telescope in history. Last December, JWST launched famously and successfully to its observing station out beyond the moon. And now according to NASA, as soon as next week, the JWST will at long last begin releasing scientific images and data.

Mark Kahan, on JWST’s product integrity team, recalls NASA’s engineering challenge as a call to arms for a worldwide team of thousands that set out to create one of the most ambitious scientific instruments in human history. Kahan—chief electro-optical systems engineer at Mountain View, Calif.–based Synopsys—and many others in JWST’s “pit crew” (as he calls the team) drew hard lessons from three decades ago, having helped repair another world-class space telescope with a debilitating case of flawed optics. Of course the Hubble Space Telescope is in low Earth orbit, and so a special space-shuttle mission to install corrective optics (as happened in 1993) was entirely possible.

Not so with the JWST.

The meticulous care NASA demanded of JWST’s designers is all the more a necessity because Webb is well out of reach of repair crews. Its mission is to study the infrared universe, and that requires shielding the telescope and its sensors from both the heat of sunlight and the infrared glow of Earth. A good place to do that without getting too far from Earth is an empty patch of interplanetary space 1.5 million kilometers away (well beyond the moon’s orbit) near a spot physicists call the second Lagrange point, or L2.

The pit crew’s job was “down at the detail level, error checking every critical aspect of the optical design,” says Kahan. Having learned the hard way from Hubble, the crew insisted that every measurement on Webb’s optics be made in at least two different ways that could be checked and cross-checked. Diagnostics were built into the process, Kahan says, so that “you could look at them to see what to kick” to resolve any discrepancies. Their work had to be done on the ground, but their tests had to assess how the telescope would work in deep space at cryogenic temperatures.

Three New Technologies for the Main Mirror

Superficially, Webb follows the design of all large reflecting telescopes. A big mirror collects light from stars, galaxies, nebulae, planets, comets, and other astronomical objects—and then focuses those photons onto a smaller secondary mirror that sends it to a third mirror that then ultimately directs the light to instruments that record images and spectra.

Webb’s 6.5-meter primary mirror is the first segmented mirror to be launched into space. All the optics had to be made on the ground at room temperature but were deployed in space and operated at 30 to 55 degrees above absolute zero. “We had to develop three new technologies” to make it work, says Lee D. Feinberg of the NASA Goddard Space Flight Center, the optical telescope element manager for Webb for the past 20 years.

The longest wavelengths that Hubble has to contend with are 2.5 micrometers, whereas Webb is built to observe infrared light that stretches to 28 μm in wavelength. Compared with Hubble, whose primary mirror is a circle with an area of 4.5 square meters, “[Webb’s primary mirror] had to be 25 square meters,” says Feinberg. Webb also “needed segmented mirrors that were lightweight, and its mass was a huge consideration,” he adds. No single-component mirror that could provide the required resolution would have fit on the Ariane 5 rocket that launched JWST. That meant the mirror would have to be made in pieces, assembled, folded, secured to withstand the stress of launch, then unfolded and deployed in space to create a surface that was within tens of nanometers of the shape specified by the designers.

The James Webb Space Telescope [left] and the Hubble Space Telescope side by side—with Hubble’s 2.4-meter-diameter mirror versus Webb’s array of hexagonal mirrors making a 6.5-meter-diameter light-collecting area. NASA Goddard Space Flight Center

NASA and the U.S. Air Force, which has its own interests in large lightweight space mirrors for surveillance and focusing laser energy, teamed up to develop the technology. The two agencies narrowed eight submitted proposals down to two approaches for building JWST’s mirrors: one based on low-expansion glass made of a mixture of silicon and titanium dioxides similar to that used in Hubble and the other the light but highly toxic metal beryllium. The most crucial issue came down to how well the materials could withstand temperature changes from room temperature on the ground to around 50 K in space. Beryllium won because it could fully release stress after cooling without changing its shape, and it’s not vulnerable to the cracking that can occur in glass. The final beryllium mirror was a 6.5-meter array of 18 hexagonal beryllium mirrors, each weighing about 20 kilograms. The weight per unit area of JWST’s mirror was only 10 percent of that in Hubble. A 100-nanometer layer of pure gold makes the surface reflect 98 percent of incident light from JWST’s main observing band of 0.6 to 28.5 μm. “Pure silver has slightly higher reflectivity than pure gold, but gold is more robust,” says Feinberg. A thin layer of amorphous silica protects the metal film from surface damage.
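The mass savings can be checked from the figures given: 18 segments of about 20 kilograms each, spread over 25 square meters of collecting area:

```python
# Areal density of Webb's primary mirror from the numbers in the text.
segments, segment_mass_kg, area_m2 = 18, 20, 25
webb_areal_density = segments * segment_mass_kg / area_m2
print(webb_areal_density)  # 14.4 kg per square meter

# The text puts this at about 10 percent of Hubble's figure, implying
# Hubble's mirror carried on the order of 140 kg per square meter.
print(round(webb_areal_density / 0.10))  # 144
```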

In addition, a wavefront-sensing control system keeps mirror segment surfaces aligned to within tens of nanometers. Built on the ground, the system is expected to keep mirror alignment stabilized throughout the telescope’s operational life. A backplane kept at a temperature of 35 K holds all 2.4 tonnes of the telescope and instruments rock-steady to within 32 nm while maintaining them at cryogenic temperatures during observations.

The JWST backplane, the “spine” that supports the entire hexagonal mirror structure and carries more than 2,400 kg of hardware, is readied for assembly to the rest of the telescope. NASA/Chris Gunn

Hubble’s amazing, long-exposure images of distant galaxies are possible through the use of gyroscopes and reaction wheels. The gyroscopes are used to sense unwanted rotations, and reaction wheels are used to counteract them.

But the gyroscopes used on Hubble have had a bad track record and have had to be replaced repeatedly. Only three of Hubble’s six gyros remain operational today, and NASA has devised plans for operating with one or two gyros at reduced capability. Hubble also includes reaction wheels and magnetic torquers, used to maintain its orientation when needed or to point at different parts of the sky.

Webb uses reaction wheels similarly to turn across the sky, but instead of using mechanical gyros to sense direction, it uses hemispherical resonator gyroscopes, which have no moving parts. Webb also has a small fine-steering mirror in the optical path, which can tilt over an angle of just 5 arc seconds. Those very fine adjustments of the light path into the instruments keep the telescope on target. “It’s a really wonderful way to go,” says Feinberg, adding that it compensates for small amounts of jitter without having to move the whole 6-tonne observatory.


Other optics distribute light from the fine-steering mirror among four instruments, two of which can observe simultaneously. Three instruments have sensors that observe wavelengths of 0.6 to 5 μm, which astronomers call the near-infrared. The fourth, called the Mid-InfraRed Instrument (MIRI), observes what astronomers call the mid-infrared spectrum, from 5 to 28.5 μm. Different instruments are needed because sensors and optics have limited wavelength ranges. (Optical engineers may blanch slightly at astronomers’ definitions of what constitutes the near- and mid-infrared wavelength ranges. These two groups simply have differing conventions for labeling the various regimes of the infrared spectrum.)

Mid-infrared wavelengths are crucial for observing young stars, planetary systems, and the earliest galaxies, but they also pose some of the biggest engineering challenges. Everything on Earth, as well as the planets out to Jupiter, glows in the mid-infrared. So for JWST to observe distant astronomical objects, it must avoid recording extraneous mid-infrared noise from all the various sources inside the solar system. “I have spent my whole career building instruments for wavelengths of 5 μm and longer,” says MIRI instrument scientist Alistair Glasse of the Royal Observatory, in Edinburgh. “We’re always struggling against thermal background.”

Mountaintop telescopes can see the near-infrared, but observing the mid-infrared sky requires telescopes in space. However, the thermal radiation from Earth and its atmosphere can cloud their view, and so can the telescopes themselves unless they are cooled far below room temperature. An ample supply of liquid helium and an orbit far from Earth allowed the Spitzer Space Telescope’s primary observing mission to last for five years, but once the last of the cryogenic fluid evaporated in 2009, its observations were limited to wavelengths shorter than 5 μm.

Webb has an elaborate solar shield to block sunlight, and an orbit 1.5 million km from Earth, which together keep the telescope below 55 K, but that’s not good enough for low-noise observations at wavelengths longer than 5 μm. The near-infrared instruments operate at 40 K to minimize thermal noise. But for observations out to 28.5 μm, MIRI relies on a specially developed closed-cycle helium cryocooler to stay below 7 K. “We want to have sensitivity limited by the shot noise of astronomical sources,” says Glasse. (Shot noise occurs when optical or electrical signals are so feeble that each photon or electron constitutes a detectable peak.) That will make MIRI 1,000 times as sensitive in the mid-infrared as Spitzer.
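The shot-noise limit Glasse describes can be sketched in a few lines: when counting statistics are the only noise source, the signal-to-noise ratio grows as the square root of the number of detected photons. The photon rates below are hypothetical illustration values, not MIRI figures:

```python
import math

def shot_noise_snr(photon_rate_hz: float, exposure_s: float) -> float:
    """SNR of a measurement limited purely by shot noise: with N detected
    photons, signal = N and noise = sqrt(N), so SNR = sqrt(N)."""
    n_photons = photon_rate_hz * exposure_s
    return math.sqrt(n_photons)

# A faint source at 100 photons per second (hypothetical):
print(shot_noise_snr(100.0, 1.0))  # 10.0
print(shot_noise_snr(100.0, 4.0))  # 20.0 -- 4x the exposure, only 2x the SNR
```

This square-root scaling is why suppressing thermal background matters so much: any extra background photons add noise without adding signal.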

Another challenge is the limited transparency of optical materials in the mid-infrared. “We use reflective optics wherever possible,” says Glasse, but they also pose problems, he adds. “Thermal contraction is a big deal,” he says, because the instrument was made at room temperature but is used at 7 K. To keep thermal changes uniform throughout MIRI, they made the whole structure of gold-coated aluminum lest other metals cause warping.

Detectors are another problem. Webb’s near-infrared sensors use mercury cadmium telluride photodetectors with a resolution of 2,048 x 2,048 pixels. This resolution is widely used at wavelengths below 5 μm, but sensing at MIRI’s longer wavelengths required exotic detectors limited to 1,024 x 1,024 pixels.

Glasse says commissioning “has gone incredibly well.” Although some stray light has been detected, he says, “we are fully expecting to meet all our science goals.”

NIRCam Aligns the Whole Telescope

The near-infrared detectors and optical materials used for observing at wavelengths shorter than 5 μm are much more mature than those for the mid-infrared, so the Near-Infrared Camera (NIRCam) does double duty by both recording images and aligning all the optics in the whole telescope. That alignment was the trickiest part of building the instrument, says NIRCam principal investigator Marcia Rieke of the University of Arizona.

Alignment means ensuring that all the light collected by the primary mirror reaches the right place in the final image. That’s crucial for Webb, because its 18 separate segments have to overlay their images perfectly, and because all those segments were built on the ground at room temperature but operate at cryogenic temperatures, in zero gravity, in space. When NASA recorded a test image of a single star after Webb first opened its primary mirror, it showed 18 separate bright spots, one from each segment. When alignment was completed on 11 March, the image from NIRCam showed a single star with six spikes caused by diffraction.

Even when performing instrumental calibration tasks, JWST couldn’t help but showcase its stunning sensitivity to the infrared sky. The central star is what telescope technicians used to align JWST’s mirrors. But notice the distant galaxies and stars that photobombed the image too! NASA/STScI

Building a separate alignment system would have added to both the weight and cost of Webb, Rieke realized, and in the original 1995 plan for the telescope she proposed designing NIRCam so it could align the telescope optics once it was up in space as well as record images. “The only real compromise was that it required NIRCam to have exquisite image quality,” says Rieke, wryly. From a scientific point of view, she adds, using the instrument to align the telescope optics “is great because you know you’re going to have good image quality and it’s going to be aligned with you.” Alignment might be just a tiny bit off for other instruments. In the end, it took a team at Lockheed Martin to develop the computational tools to account for all the elements of thermal expansion.

Escalating costs and delays had troubled Webb for years. But for Feinberg, “commissioning has been a magical five months.” It began with the sight of sunlight hitting the mirrors. The segmented mirror deployed smoothly, and after the near-infrared cameras cooled, the mirrors focused one star into 18 spots, then aligned them to put the spots on top of each other. “Everything had to work to get it to [focus] that well,” he says. It’s been an intense time, but for Feinberg, a veteran of the Hubble repair mission, commissioning Webb was “a piece of cake.”

NASA announced that between May 23rd and 25th, one segment of the primary mirror had been dinged by a micrometeorite bigger than the agency had expected when it analyzed the potential results of such impacts. “Things do degrade over time,” Feinberg said. But he added that Webb had been engineered to minimize damage, and NASA said the event had not affected Webb’s operation schedule.

Corrections 26-28 July 2022: The story was updated a) to reflect the fact that the Lagrange point L2 where Webb now orbits is not that of the "Earth-moon system" (as the story had originally reported) but rather the Earth-sun system
and b) to correct misstatements in the original posting about Webb's hardware for controlling its orientation.

Corrections 12 Aug. 2022: Alistair Glasse's name was incorrectly spelled in a previous version of this story, as was NIRCam (which we'd spelled as NIRcam); Webb's tertiary mirror (we'd originally reported only its primary and secondary mirrors) was also called out in this version.

This article appears in the September 2022 print issue as “Inside the Universe Machine.”

Match ID: 22 Score: 2.14 source: spectrum.ieee.org age: 80 days
qualifiers: 2.14 toxic

NASA to Industry: Let’s Develop Flight Tech to Reduce Carbon Emissions
Wed, 29 Jun 2022 14:25 EDT
NASA announced Wednesday the agency is seeking partners to develop technologies needed to shape a new generation of lower-emission, single-aisle airliners that passengers could see in airports in the 2030s.
Match ID: 23 Score: 2.14 source: www.nasa.gov age: 86 days
qualifiers: 2.14 carbon

U.N. Kills Any Plans to Use Mercury as a Rocket Propellant
Tue, 19 Apr 2022 18:00:01 +0000

A recent United Nations provision has banned the use of mercury in spacecraft propellant. Although no private company has actually used mercury propellant in a launched spacecraft, the possibility was alarming enough—and the dangers extreme enough—that the ban was enacted just a few years after one U.S.-based startup began toying with the idea. Had the company gone through with its intention to sell mercury propellant thrusters to some of the companies building massive satellite constellations over the coming decade, it would have resulted in Earth’s upper atmosphere being laced with mercury.

Mercury is a neurotoxin. It’s also bio-accumulative, which means it’s absorbed by the body at a faster rate than the body can remove it. The most common way to get mercury poisoning is through eating contaminated seafood. “It’s pretty nasty,” says Michael Bender, the international coordinator of the Zero Mercury Working Group (ZMWG). “Which is why this is one of the very few instances where the governments of the world came together pretty much unanimously and ratified a treaty.”

Bender is referring to the 2013 Minamata Convention on Mercury, a U.N. treaty named for a city in Japan whose residents suffered from mercury poisoning from a nearby chemical factory for decades. Because mercury pollutants easily find their way into the oceans and the atmosphere, it’s virtually impossible for one country to prevent mercury poisoning within its borders. “Mercury—it’s an intercontinental pollutant,” Bender says. “So it required a global treaty.”

Today, the only remaining permitted uses for mercury are in fluorescent lighting and dental amalgams, and even those are being phased out. Mercury is otherwise found as a by-product of other processes, such as the burning of coal. But then a company hit on the idea to use it as a spacecraft propellant.

In 2018, an employee at Apollo Fusion approached the Public Employees for Environmental Responsibility (PEER), a nonprofit that investigates environmental misconduct in the United States. The employee—who has remained anonymous—alleged that the Mountain View, Calif.–based space startup was planning to build and sell thrusters that used mercury propellant to multiple companies building low Earth orbit (LEO) satellite constellations.

Four industry insiders ultimately confirmed that Apollo Fusion was building thrusters that utilized mercury propellant. Apollo Fusion, which was acquired by rocket manufacturing startup Astra in June 2021, insisted that the composition of its propellant mixture should be considered confidential information. The company withdrew its plans for a mercury propellant in April 2021. Astra declined to respond to a request for comment for this story.

Apollo Fusion wasn’t the first to consider using mercury as a propellant. NASA originally tested it in the 1960s and 1970s with two Space Electric Propulsion Tests (SERT), one of which was sent into orbit in 1970. Although the tests demonstrated mercury’s effectiveness as a propellant, the same concerns over the element’s toxicity that have seen it banned in many other industries halted its use by the space agency as well.

“I think it just sort of fell off a lot of folks’ radars,” says Kevin Bell, the staff counsel for PEER. “And then somebody just resurrected the research on it and said, ‘Hey, other than the environmental impact, this was a pretty good idea.’ It would give you a competitive advantage in what I imagine is a pretty tight, competitive market.”

That’s presumably why Apollo Fusion was keen on using it in their thrusters. Apollo Fusion as a startup emerged more or less simultaneously with the rise of massive LEO constellations that use hundreds or thousands of satellites in orbits below 2,000 kilometers to provide continual low-latency coverage. Finding a slightly cheaper, more efficient propellant for one large geostationary satellite doesn’t move the needle much. But doing the same for thousands of satellites that need to be replaced every several years? That’s a much more noticeable discount.

Were it not for mercury’s extreme toxicity, it would actually make an extremely attractive propellant. Apollo Fusion wanted to use a type of ion thruster called a Hall-effect thruster. Ion thrusters strip electrons from the atoms that make up a liquid or gaseous propellant, and then an electric field pushes the resultant ions away from the spacecraft, generating a modest thrust in the opposite direction. The physics of rocket engines means that the performance of these engines increases with the mass of the ion that you can accelerate.

Mercury is heavier than either xenon or krypton, the most commonly used propellants, meaning more thrust per expelled ion. It’s also liquid at room temperature, making it efficient to store and use. And it’s cheap—there’s not a lot of competition with anyone looking to buy mercury.
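That mass advantage can be made concrete with the basic ion-acceleration relation qV = ½mv², which gives an exhaust velocity v = √(2qV/m); thrust per unit beam power is then 2/v, so heavier ions (lower v at the same voltage) deliver more thrust per watt. The 300-volt accelerating potential below is a hypothetical figure for illustration, not one from the article:

```python
import math

Q = 1.602176634e-19      # elementary charge, C
AMU = 1.66053906660e-27  # atomic mass unit, kg

def exhaust_velocity_m_s(ion_mass_amu: float, volts: float) -> float:
    """Exhaust velocity of a singly charged ion accelerated through a
    potential V: from qV = (1/2) m v^2, v = sqrt(2 q V / m)."""
    return math.sqrt(2 * Q * volts / (ion_mass_amu * AMU))

# Compare the three propellants at a hypothetical 300-V potential.
for name, amu in [("krypton", 83.8), ("xenon", 131.3), ("mercury", 200.6)]:
    v = exhaust_velocity_m_s(amu, 300.0)
    # Thrust per unit beam power is F/P = 2/v, so it scales as sqrt(m).
    print(f"{name:8s} v = {v/1e3:5.1f} km/s  thrust/power = {2/v*1e6:5.1f} mN/kW")
```

Mercury comes out with the lowest exhaust velocity and therefore the highest thrust per unit of beam power of the three.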

Bender says that ZMWG, alongside PEER, caught wind of Apollo Fusion marketing its mercury-based thrusters to at least three companies deploying LEO constellations—OneWeb, Planet Labs, and SpaceX. Planet Labs, an Earth-imaging company, has at least 200 CubeSats in low Earth orbit. OneWeb and SpaceX, both wireless-communication providers, have many more. OneWeb plans to have nearly 650 satellites in orbit by the end of 2022. SpaceX already has nearly 1,500 active satellites aloft in its Starlink constellation, with an eye toward deploying as many as 30,000 satellites before its constellation is complete. Other constellations, like Amazon’s Kuiper constellation, are also planning to deploy thousands of satellites.

In 2019, a group of researchers in Italy and the United States estimated how much of the mercury used in spacecraft propellant might find its way back into Earth’s atmosphere. They figured that a hypothetical LEO constellation of 2,000 satellites, each carrying 100 kilograms of propellant, would emit 20 tonnes of mercury every year over the course of a 10-year life span. Three quarters of that mercury, the researchers suggested, would eventually wind up in the oceans.
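The arithmetic behind that estimate is straightforward to reproduce:

```python
# Reproducing the 2019 study's back-of-the-envelope estimate cited above:
# a hypothetical constellation of 2,000 satellites, each carrying 100 kg
# of mercury propellant expended over a 10-year life span, with three
# quarters of the emitted mercury ending up in the oceans.
satellites = 2_000
propellant_kg_each = 100
lifetime_years = 10

total_tonnes = satellites * propellant_kg_each / 1_000  # 200 tonnes in all
tonnes_per_year = total_tonnes / lifetime_years         # 20 tonnes per year
to_oceans_per_year = 0.75 * tonnes_per_year             # 15 tonnes per year
print(tonnes_per_year, to_oceans_per_year)
```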

That amounts to 1 percent of global mercury emissions from a constellation only a fraction of the size of the one planned by SpaceX alone. And if multiple constellations adopted the technology, they would represent a significant percentage of global mercury emissions—especially, the researchers warned, as other uses of mercury are phased out as planned in the years ahead.

Fortunately, it’s unlikely that any mercury propellant thrusters will even get off the ground. Prior to the fourth meeting of the Minamata Convention, Canada, the European Union, and Norway highlighted the dangers of mercury propellant, alongside ZMWG. The provision to ban mercury usage in satellites was passed on 26 March 2022.

The question now is enforcement. “Obviously, there aren’t any U.N. peacekeepers going into space to shoot down” mercury-based satellites, says Bell. But the 137 countries, including the United States, who are party to the convention have pledged to adhere to its provisions—including the propellant ban.

The United States is notable in that list because as Bender explains, it did not ratify the Minamata Convention via the U.S. Senate but instead deposited with the U.N. an instrument of acceptance. In a 7 November 2013 statement (about one month after the original Minamata Convention was adopted), the U.S. State Department said the country would be able to fulfill its obligations “under existing legislative and regulatory authority.”

Bender says the difference is “weedy” but that this appears to mean that the U.S. government has agreed to adhere to the Minamata Convention’s provisions because it already has similar laws on the books. Except there is still no existing U.S. law or regulation banning mercury propellant. For Bender, that creates some uncertainty around compliance when the provision goes into force in 2025.

Still, with a U.S. company being the first startup to toy with mercury propellant, it might be ideal to have a stronger U.S. ratification of the Minamata Convention before another company hits on the same idea. “There will always be market incentives to cut corners and do something more dangerously,” Bell says.

Update 19 April 2022: In an email, a spokesperson for Astra stated that the company's propulsion system, the Astra Spacecraft Engine, does not use mercury. The spokesperson also stated that Astra has no plans to use mercury propellant and that the company does not have anything in orbit that uses mercury.

Updated 20 April 2022 to clarify that Apollo Fusion was building thrusters that used mercury, not that they had actually used them.

Match ID: 24 Score: 2.14 source: spectrum.ieee.org age: 158 days
qualifiers: 2.14 toxic

Ahrefs vs SEMrush: Which SEO Tool Should You Use?
Tue, 01 Mar 2022 12:16:00 +0000

SEMrush and Ahrefs are among the most popular tools in the SEO industry. Both companies have been in business for years and have thousands of customers per month.

If you're a professional SEO or trying to do digital marketing on your own, at some point you'll likely consider using a tool to help with your efforts. Ahrefs and SEMrush are two names that will likely appear on your shortlist.

In this guide, I'm going to help you learn more about these SEO tools and how to choose the one that's best for your purposes.

What is SEMrush?


SEMrush is a popular SEO tool with a wide range of features—it's the leading competitor research service for online marketers. SEMrush's Keyword Magic tool offers over 20 billion keywords for Google, which are constantly updated, making it the largest keyword database available.

The program began in 2007 as SeoQuake, a small Firefox extension.


  • Most accurate keyword data: Accurate search volume data is crucial for SEO and PPC campaigns because it lets you identify which keywords are most likely to bring in sales from ad clicks. SEMrush constantly updates its databases to provide the most accurate data.
  • Largest keyword database: SEMrush's Keyword Magic Tool now features 20 billion keywords, giving marketers and SEO professionals the largest database of keywords.

  • All SEMrush users receive daily ranking data, mobile volume information, and the option to buy additional keywords by default, with no extra payment or add-ons needed.
  • Most accurate position tracking tool: This tool provides all subscribers with basic tracking capabilities, making it suitable for SEO professionals. Plus, the Position Tracking tool provides local-level data to everyone who uses the tool.
  • SEO Data Management: SEMrush makes managing your online data easy by allowing you to create visually appealing custom PDF reports, including Branded and White Label reports, report scheduling, and integration with GA, GMB, and GSC.
  • Toxic link monitoring and penalty recovery: With SEMrush, you can make a detailed analysis of toxic backlinks, toxic scores, toxic markers, and outreach to those sites.
  • Content Optimization and Creation Tools: SEMrush offers content optimization and creation tools that let you create SEO-friendly content. Features include the SEO Writing Assistant, On-Page SEO Checker, SEO Content Template, Content Audit, Post Tracking, and Brand Monitoring.



What is Ahrefs?

Ahrefs is a leading SEO platform that offers a set of tools to grow your search traffic, research your competitors, and monitor your niche. The company was founded in 2010, and it has become a popular choice among SEO tools. Ahrefs has a keyword index of over 10.3 billion keywords and offers accurate, extensive backlink data updated every 15 to 30 minutes, in what it calls the world's most extensive backlink index database.


  • Backlink alerts data and new keywords: Get an alert when your site is linked to or discussed in blogs, forums, comments, or when new keywords are added to a blog posting about you.
  • Intuitive interface: The intuitive design of the widget helps you see the overall health of your website and search engine ranking at a glance.
  • Site Explorer: The Site Explorer will give you an in-depth look at your site's search traffic.
  • Domain Comparison
  • Reports with charts and graphs
  • JavaScript rendering and a site audit can identify SEO issues.
  • A question explorer that provides well-crafted topic suggestions

Direct Comparisons: Ahrefs vs SEMrush

Now that you know a little more about each tool, let's take a look at how they compare. I'll analyze each tool to see how they differ in interfaces, keyword research resources, rank tracking, and competitor analysis.

User Interface

Ahrefs and SEMrush both offer comprehensive information and quick metrics regarding your website's SEO performance. However, Ahrefs takes a bit more of a hands-on approach to getting your account fully set up, whereas SEMrush's simpler dashboard can give you access to the data you need quickly.

In this section, we provide a brief overview of the elements found on each dashboard and highlight the ease with which you can complete tasks.



The Ahrefs dashboard is less cluttered than that of SEMrush, and its primary menu is at the very top of the page, with a search bar designed only for entering URLs.

Additional features of the Ahrefs platform include:

  • You can see analytics from the dashboard, including search engine rankings to domain ratings, referring domains, and backlink
  • Jumping from one tool to another is easy. You can use the Keyword Explorer to find a keyword to target and then directly track your ranking with one click.
  • The website offers a tooltip helper tool that allows you to hover your mouse over something that isn't clear and get an in-depth explanation.



When you log into SEMrush, you will find four main modules. These include information about your domains, organic keyword analysis, ad keywords, and site traffic.

You'll also find some other options like

  • A search bar allows you to enter a domain, keyword, or anything else you wish to explore.
  • A menu on the left side of the page provides quick links to relevant information, including marketing insights, projects, keyword analytics, and more.
  • The customer support resources located directly within the dashboard can be used to communicate with the support team or to learn about other resources such as webinars and blogs.
  • Detailed descriptions of every resource offered. This detail is beneficial for new marketers, who are just starting.


Both Ahrefs and SEMrush have user-friendly dashboards, but Ahrefs is less cluttered and easier to navigate. On the other hand, SEMrush offers dozens of extra tools, including access to customer support resources.

When deciding on which dashboard to use, consider what you value in the user interface, and test out both.

Rank Tracking

If you're looking to track your website's search engine ranking, rank tracking features can help. You can also use them to monitor your competitors.

Let's take a look at Ahrefs vs. SEMrush to see which tool does a better job.



The Ahrefs Rank Tracker is simpler to use. Just type in the domain name and keywords you want to analyze, and it spits out a report showing you the search engine results page (SERP) ranking for each keyword you enter.

Rank Tracker looks at the ranking performance of keywords and compares them with the top rankings for those keywords. Ahrefs also offers:

You'll see metrics that help you understand your visibility, traffic, average position, and keyword difficulty.

It gives you an idea of whether a keyword would be profitable to target or not.



SEMrush offers a tool called Position Tracking. It's a project tool: you must set it up as a new project. Below are a few of the most popular features of the SEMrush Position Tracking tool:

All subscribers receive regular data updates and mobile search rankings.

The platform provides opportunities to track several SERP features, including Local tracking.

Intuitive reports allow you to track statistics for the pages on your website, as well as the keywords used in those pages.

Identify pages that may be competing with each other using the Cannibalization report.


Ahrefs is a more user-friendly option. It takes seconds to enter a domain name and keywords. From there, you can quickly decide whether to proceed with that keyword or figure out how to rank better for other keywords.

SEMrush allows you to check your mobile rankings and receive ranking updates daily, which is something Ahrefs does not offer. SEMrush also offers social media rankings, a tool you won't find within the Ahrefs platform. Both are good; which one do you like? Let me know in the comments.

Keyword Research

Keyword research is closely related to rank tracking, but it's used for deciding which keywords you plan on using for future content rather than those you use now.

When it comes to SEO, keyword research is the most important thing to consider when comparing the two platforms.


The Ahrefs Keyword Explorer provides you with thousands of keyword ideas and filters search results based on the chosen search engine.

Ahrefs supports several features, including:

  • It can search multiple keywords in a single query and analyze them together. (SEMrush offers this too, in its Keyword Overview.)
  • Ahrefs has a variety of keywords for different search engines, including Google, YouTube, Amazon, Bing, Yahoo, Yandex, and other search engines.
  • When you click on a keyword, you can see its search volume and keyword difficulty, but also other keywords related to it, which you didn't use.


SEMrush's Keyword Magic Tool has over 20 billion keywords for Google. You can type in any keyword you want, and a list of suggested keywords will appear.

The Keyword Magic Tool also lets you:

  • Show performance metrics by keyword
  • Filter search results by both broad and exact keyword matches.
  • Show data like search volume, trends, keyword difficulty, and CPC.
  • Show the first 100 Google search results for any keyword.
  • Identify SERP Features and Questions related to each keyword
  • SEMrush has released a new Keyword Gap Tool that uncovers potentially useful keyword opportunities for you, including both paid and organic keywords.


Both of these tools offer keyword research features and allow users to break down complicated tasks into something that can be understood by beginners and advanced users alike.

If you're interested in keyword suggestions, SEMrush appears to have more keyword suggestions than Ahrefs does. It also continues to add new features, like the Keyword Gap tool and SERP Questions recommendations.

Competitor Analysis

Both platforms offer competitor analysis tools, eliminating the need to come up with keywords off the top of your head. Each tool surfaces the keywords your competitors rank for, so you know which ones are likely to be valuable to you.


Ahrefs' domain comparison tool lets you compare up to five websites (your website and four competitors) side by side. It also shows you how your site ranks against others on metrics such as backlinks, domain ratings, and more.

Use the Competing Domains section to see a list of your most direct competitors, and explore how many keyword matches they have with you.

To find more information about your competitor, you can look at the Site Explorer and Content Explorer tools and type in their URL instead of yours.


SEMrush provides a variety of insights into your competitors' marketing tactics. The platform enables you to research your competitors effectively. It also offers several resources for competitor analysis including:

Traffic Analytics helps you identify where your audience comes from, how they engage with your site, what devices visitors use to view your site, and how your audiences overlap with other websites.

SEMrush's Organic Research examines your website's major competitors and shows their organic search rankings, keywords they are ranking for, and even if they are ranking for any (SERP) features and more.

The Market Explorer search field allows you to type in a domain and lists websites or articles similar to what you entered. Market Explorer also allows users to perform in-depth data analytics on these companies and markets.


SEMrush wins here because it has more tools dedicated to competitor analysis than Ahrefs. However, Ahrefs offers a lot of functionality in this area, too. It takes a combination of both tools to gain an advantage over your competition.



Ahrefs pricing:

  • Lite Monthly: $99/month
  • Standard Monthly: $179/month
  • Annual Lite: $990/year
  • Annual Standard: $1,790/year


SEMrush pricing:

  • Pro Plan: $119.95/month
  • Guru Plan: $229.95/month
  • Business Plan: $449.95/month

Which SEO tool should you choose for digital marketing?

When it comes to keyword research, it can be hard to decide which one to choose.

Consider choosing Ahrefs if you:

  • Like friendly and clean interface
  • Searching for simple keyword suggestions

  • Want to get more keywords for different search engines like Amazon, Bing, Yahoo, Yandex, Baidu, and more


Consider SEMrush if you:

  • Want more marketing and SEO features
  • Need competitor analysis tool
  • Need to keep your backlinks profile clean
  • Looking for more keyword suggestions for Google

Both tools are great. Choose the one that meets your requirements, and if you have experience using either Ahrefs or SEMrush, let me know in the comment section which works well for you.



Match ID: 25 Score: 2.14 source: www.crunchhype.com age: 207 days
qualifiers: 2.14 toxic

Filter efficiency 96.615 (26 matches/768 results)

********** UNIVERSITY **********
return to top

Forget Oxbridge: St Andrews knocks top universities off perch
Sat, 24 Sep 2022 07:00:07 GMT

Latest Guardian University Guide shows leading trio are in league of their own for undergraduate courses

Oxbridge is being replaced at the apex of UK universities by “Stoxbridge” after St Andrews overtook Oxford and Cambridge at the top of the latest Guardian University Guide.

It is the first time the Fife university has been ranked highest in the Guardian’s annual guide to undergraduate courses, pushing Oxford into second and Cambridge into third.

Continue reading...
Match ID: 0 Score: 30.00 source: www.theguardian.com age: 0 days
qualifiers: 30.00 rankings

The Guardian University Guide 2023 – the rankings
Sat, 24 Sep 2022 06:59:07 GMT

Find a course at one of the top universities in the country. Our league tables rank them all subject by subject, as well as by student satisfaction, staff numbers, spending and career prospects

Continue reading...
Match ID: 1 Score: 30.00 source: www.theguardian.com age: 0 days
qualifiers: 30.00 rankings

Video Friday: Loona
Fri, 16 Sep 2022 18:19:52 +0000

Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

IROS 2022: 23–27 October 2022, KYOTO, JAPAN
ANA Avatar XPRIZE Finals: 4–5 November 2022, LOS ANGELES
CoRL 2022: 14–18 December 2022, AUCKLAND, NEW ZEALAND

Enjoy today's videos!

Another robotic pet on Kickstarter, another bunting of red flags.

Let's see, we've got: "she's so playful and affectionate you'll forget she's a robot." "Everything you can dream of in a best friend and more." "Get ready to fall in love!" And that's literally like the first couple of tiles on the Kickstarter post. Look, the hardware seems fine, and there is a lot of expressiveness going on, I just wish they didn't set you up for an inevitable disappointment when after a couple of weeks it becomes apparent that yes, this is just a robotic toy, and will never be your best friend (or more).

Loona is currently on Kickstarter for about US $300.

[ Kickstarter ]

Inspired by the flexibility and resilience of dragonfly wings, we propose a novel design for a biomimetic drone propeller called Tombo propeller. Here, we report on the design and fabrication process of this biomimetic propeller that can accommodate collisions and recover quickly, while maintaining sufficient thrust force to hover and fly.


Thanks Van!

Meet Tom, a software engineer at Boston Dynamics, as he shares insights on programming and testing the practical—and impractical—applications of robotics. Whether Spot is conducting inspections or playing an instrument, learn how we go from code on a computer to actions in the real world.

Yeah, but where do I get that awesome shirt?!

[ Boston Dynamics ]

This Ameca demo couples automated speech recognition with GPT-3, a large language model that generates meaningful answers. The output is fed to an online TTS service, which generates the voice and the visemes for lip-sync timing. The team at Engineered Arts Ltd. pose the questions.

"Meaningful answers."

[ Engineered Arts ]

The ANT project develops a navigation and motion control system for future walking systems for planetary exploration. After successful testing on ramps and rubble fields, the challenge of climbing rough inclines such as craters is being tackled.

[ DFKI ]

Look, if you’re going to crate-train Spot, at least put some blankets and stuffed animals in there or something.

[ Energy Robotics ]

With multitrade layout, all trades’ layouts are set down with a single pass over the floor by Dusty's FieldPrinter system. Trades experience unparalleled clarity and communication with each other, because they can see each others’ installation plans and immediately identify and resolve conflicts. Instead of fighting over the floor and pointing fingers, they start to solve problems together.

[ Dusty Robotics ]

We present QUaRTM—a novel quadcopter design capable of tilting the propellers into the forward flight direction, which reduces the drag area and therefore allows for faster, more agile, and more efficient flight.

[ HiPeRLab ]

Is there an option in the iRobot app to turn my Roomba into a cake? Because I want cake.

[ iRobot ]

Looks like SoftBank is getting into high-density robotic logistics.

[ Impress ]

GITAI S2 ground test for space debris removal. During this demonstration, a tool changer was also tested to perform several different tasks at OSAM.


Recent advances allow for the automation of food preparation in high-throughput environments, yet the successful deployment of these robots requires the planning and execution of quick, robust, and ultimately collision-free behaviors. In this work, we showcase a novel framework for modifying previously generated trajectories of robotic manipulators in highly detailed and dynamic collision environments.

[ Paper ]

The LCT Hospital in South Korea uses “Dr. LCT” for robotic-based orthopedic knee procedures. The system is based on the KUKA LBR Med robotic platform, which is ideally suited for orthopedic surgery with its seven axes, software developed specifically for medical technology, and appropriate safety measures.

[ Kuka ]

A year in review. Compilation of 2022 video highlights of the Game Changing Development (GCD) Program. The Game Changing Development Program is a part of NASA’s Space Technology Mission Directorate. The program advances space technologies that may lead to entirely new approaches for the agency’s future space missions and provide solutions to significant national needs.

[ NASA ]

Naomi Wu reviews a Diablo mobile robot (with some really cool customizations of her own), sending it out to run errands in Shenzhen during lockdown.

[ Naomi Wu ]

Roundtable discussion on how teaching automation in schools, colleges, and universities can help shape the workers of tomorrow. ABB Robotics has put together a panel of experts in this field to discuss the challenges and opportunities.

[ ABB ]

On 8 September 2022, Mario Santillo of Ford talked to robotics students as the first speaker in the Undergraduate Robotics Pathways & Careers Speaker Series, which aims to answer the question “What can I do with a robotics degree?”

[ Michigan Robotics ]

Match ID: 2 Score: 2.86 source: spectrum.ieee.org age: 7 days
qualifiers: 2.86 school

Filter efficiency 99.609 (3 matches/768 results)

********** ENTERTAINMENT **********
return to top

40 of the Best Movies on Disney+ Right Now
Fri, 23 Sep 2022 11:00:00 +0000
The best classic flicks, Marvel movies, and Star Wars sagas on Disney+.
Match ID: 0 Score: 55.00 source: www.wired.com age: 1 day
qualifiers: 35.00 (best|good|great) (show|movie), 20.00 movie

32 of the Best Shows on Disney+ Right Now
Fri, 23 Sep 2022 11:00:00 +0000
Looking for something to watch? Here are the best shows on Disney’s streaming service.
Match ID: 1 Score: 35.00 source: www.wired.com age: 1 day
qualifiers: 35.00 (best|good|great) (show|movie)

Putin’s ship is sinking fast. Will he take everyone down? | Simon Tisdall
Sat, 24 Sep 2022 17:36:39 GMT

The scale of the Kremlin’s strategic failures in Ukraine is epic – and the exploded myth of Russian power may lead to the unravelling of the regime

More than ever, Vladimir Putin resembles the captain of the Titanic: steaming full speed ahead towards disaster, deluded by inaccurate assumptions about his ship’s invincibility, and blind to darkly looming hazards.

Everything the captain thinks he knows is wrong, the modern-day treasure hunter, Brock Lovett, says in the 1997 movie. And like the Titanic’s lookouts, wrong-headed Putin does not spot the iceberg until too late. There’s no avoiding catastrophe.

Continue reading...
Match ID: 2 Score: 20.00 source: www.theguardian.com age: 0 days
qualifiers: 20.00 movie

Louise Fletcher, ‘One Flew Over The Cuckoo’s Nest’ actress, dies at 88
Sat, 24 Sep 2022 10:00:33 EDT
She was known for playing Nurse Ratched in the 1975 movie that earned her an Academy Award.
Match ID: 3 Score: 20.00 source: www.washingtonpost.com age: 0 days
qualifiers: 20.00 movie

‘An unbelievable Die Hard rip-off’: two decades of Alan Rickman’s withering film reviews
Sat, 24 Sep 2022 06:00:09 GMT

When the Harry Potter actor died in 2016, he left a trove of revealing diaries – which included some very frank critiques of movies of the time

• Read an exclusive extract from Rickman’s deliciously indiscreet diaries

It’s clear from Alan Rickman’s diaries that he never lost his passion for the screen. The pages are littered with his verdicts on the movies he loved – and hated. Here’s a small selection.

Continue reading...
Match ID: 4 Score: 20.00 source: www.theguardian.com age: 0 days
qualifiers: 20.00 movie

NASA’s DART Mission Aims to Save the World
Fri, 23 Sep 2022 15:52:53 +0000

Armageddon ruined everything. Armageddon—the 1998 movie, not the mythical battlefield—told the story of an asteroid headed straight for Earth, and a bunch of swaggering roughnecks sent in space shuttles to blow it up with a nuclear weapon.

Armageddon is big and noisy and stupid and shameless, and it’s going to be huge at the box office,” wrote Jay Carr of the Boston Globe.

Carr was right—the film was the year’s second biggest hit (after Titanic)—and ever since, scientists have had to explain, patiently, that cluttering space with radioactive debris may not be the best way to protect ourselves. NASA is now trying a slightly less dramatic approach with a robotic mission called DART—short for Double Asteroid Redirection Test. On Monday at 7:14 p.m. EDT, if all goes well, the little spacecraft will crash into an asteroid called Dimorphos, about 11 million kilometers from Earth. Dimorphos is about 160 meters across, and orbits a 780-meter asteroid, 65803 Didymos. NASA TV plans to cover it live.

DART’s end will be violent, but not blockbuster-movie-violent. Music won’t swell and girlfriends back on Earth won’t swoon. Mission managers hope the spacecraft, with a mass of about 600 kilograms, hitting at 22,000 km/h, will nudge the asteroid slightly in its orbit, just enough to prove that it’s technologically possible in case a future asteroid has Earth in its crosshairs.

“Maybe once a century or so, there’ll be an asteroid sizeable enough that we’d like to certainly know, ahead of time, if it was going to impact,” says Lindley Johnson, who has the title of planetary defense officer at NASA.

“If you just take a hair off the orbital velocity, you’ve changed the orbit of the asteroid so that what would have been impact three or four years down the road is now a complete miss.”

So take that, Hollywood! If DART succeeds, it will show there are better fuels to protect Earth than testosterone.

The risk of a comet or asteroid that wipes out civilization is really very small, but large enough that policymakers take it seriously. NASA, ordered by the U.S. Congress in 2005 to scan the inner solar system for hazards, has found nearly 900 so-called NEOs—near-Earth objects—at least a kilometer across, more than 95 percent of all in that size range that probably exist. It has plotted their orbits far into the future, and none of them stand more than a fraction of a percent chance of hitting Earth in this millennium.

The DART spacecraft should crash into the asteroid Dimorphos and slow it in its orbit around the larger asteroid Didymos. The LICIACube cubesat will fly in formation to take images of the impact. Johns Hopkins APL/NASA

But there are smaller NEOs, perhaps 140 meters or more in diameter, too small to end civilization but large enough to cause mass destruction if they hit a populated area. There may be 25,000 that come within 50 million km of Earth’s orbit, and NASA estimates telescopes have only found about 40 percent of them. That’s why scientists want to expand the search for them and have good ways to deal with them if necessary. DART is the first test.

NASA takes pains to say this is a low-risk mission. Didymos and Dimorphos never cross Earth’s orbit, and computer simulations show that no matter where or how hard DART hits, it cannot possibly divert either one enough to put Earth in danger. Scientists want to see if DART can alter Dimorphos’s speed by perhaps a few centimeters per second.
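The scale of that nudge follows from momentum conservation: delta-v is roughly beta * m * v / M, where beta is the momentum-enhancement factor from ejecta. A back-of-envelope sketch, with Dimorphos's mass estimated from its quoted 160-meter size and an assumed rubble-pile density (both the density and beta are assumptions, not mission figures):

```python
import math

# Rough momentum-transfer estimate for the DART impact.
m_sc = 600.0                  # spacecraft mass, kg (from the article)
v_sc = 22_000 / 3.6           # impact speed in m/s (22,000 km/h)
radius = 80.0                 # m, half the quoted 160 m diameter
density = 2000.0              # kg/m^3, assumed rubble-pile density
M_ast = density * (4 / 3) * math.pi * radius**3   # asteroid mass estimate
beta = 1.0                    # momentum-enhancement factor; ejecta recoil pushes this above 1
delta_v = beta * m_sc * v_sc / M_ast
print(f"asteroid mass ~ {M_ast:.2e} kg, delta-v ~ {delta_v * 1000:.2f} mm/s")
```

With beta = 1 this lands at the millimeters-per-second scale; ejecta blasted off the surface multiplies the effect.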

The DART spacecraft, a 1-meter cube with two long solar panels, is elegantly simple, equipped with a telescope called DRACO, hydrazine maneuvering thrusters, a xenon-fueled ion engine and a navigation system called SMART Nav. It was launched by a SpaceX rocket in November. About 4 hours and 90,000 km before the hoped-for impact, SMART Nav will take over control of the spacecraft, using optical images from the telescope. Didymos, the larger object, should be a point of light by then; Dimorphos, the intended target, will probably not appear as more than one pixel until about 50 minutes before impact. DART will send one image per second back to Earth, but the spacecraft is autonomous; signals from the ground, 38 light-seconds away, would be useless for steering as the ship races in.
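That light-time figure is easy to sanity-check against the roughly 11 million kilometers quoted earlier (the exact Earth-asteroid distance at impact differs slightly):

```python
# One-way signal delay over the quoted Earth-asteroid distance: far too long
# for ground controllers to steer the spacecraft during final approach.
C_KM_S = 299_792.458          # speed of light, km/s
distance_km = 11_000_000      # approximate distance from the article
delay_s = distance_km / C_KM_S
print(f"one-way light time: {delay_s:.1f} s")
```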

The DART spacecraft separated from its SpaceX Falcon 9 launch vehicle 55 minutes after liftoff from Vandenberg Space Force Base, in California, on 24 November 2021. In this image from the rocket, the spacecraft had not yet unfurled its solar panels. NASA

What’s more, nobody knows the shape or consistency of little Dimorphos. Is it a solid boulder or a loose cluster of rubble? Is it smooth or craggy, round or elongated? “We’re trying to hit the center,” says Evan Smith, the deputy mission systems engineer at the Johns Hopkins Applied Physics Laboratory, which is running DART. “We don’t want to overcorrect for some mountain or crater on one side that’s throwing an odd shadow or something.”

So on final approach, DART will cover 800 km without any steering. Thruster firings could blur the last images of Dimorphos’s surface, which scientists want to study. Impact should be imaged from about 50 km away by an Italian-made minisatellite, called LICIACube, which DART released two weeks ago.

“In the minutes following impact, I know everybody is going be high fiving on the engineering side,” said Tom Statler, DART’s program scientist at NASA, “but I’m going be imagining all the cool stuff that is actually going on on the asteroid, with a crater being dug and ejecta being blasted off.”

There is, of course, a possibility that DART will miss, in which case there should be enough fuel on board to allow engineers to go after a backup target. But an advantage of the Didymos-Dimorphos pair is that it should help in calculating how much effect the impact had. Telescopes on Earth (plus the Hubble and Webb space telescopes) may struggle to measure infinitesimal changes in the orbit of Dimorphos around the sun; it should be easier to see how much its orbit around Didymos is affected. The simplest measurement may be of the changing brightness of the double asteroid, as Dimorphos moves in front of or behind its partner, perhaps more quickly or slowly than it did before impact.

“We are moving an asteroid,” said Statler. “We are changing the motion of a natural celestial body in space. Humanity’s never done that before.”

Match ID: 5 Score: 20.00 source: spectrum.ieee.org age: 1 day
qualifiers: 20.00 movie

After pushing AV1 codec, Google goes after Dolby with HDR and audio standards
Thu, 22 Sep 2022 21:40:03 +0000
AV1's Alliance for Open Media wants more royalty-free standards.
Match ID: 6 Score: 20.00 source: arstechnica.com age: 1 day
qualifiers: 20.00 movie

“Don’t Worry Darling” Is So Much More Than Hollywood Gossip Fodder
Thu, 22 Sep 2022 21:03:25 +0000
Olivia Wilde’s film is deftly designed and performed—and delivers a stunning twist.
Match ID: 7 Score: 20.00 source: www.newyorker.com age: 1 day
qualifiers: 20.00 movie

“Blonde” Is “The Passion of the Christ” for Marilyn Monroe
Tue, 20 Sep 2022 22:42:17 +0000
The film has a single idea—that Monroe was a victim—and is happy to victimize her, over and over.
Match ID: 8 Score: 17.14 source: www.newyorker.com age: 3 days
qualifiers: 17.14 movie

With “The Fabelmans,” Steven Spielberg Finally Phones Home
Tue, 20 Sep 2022 10:00:00 +0000
The director’s new film is a retelling of his parents’ troubled marriage and leads the pack in the Oscar race.
Match ID: 9 Score: 14.29 source: www.newyorker.com age: 4 days
qualifiers: 14.29 movie

Nvidia’s New Chip Shows Its Muscle in AI Tests
Mon, 12 Sep 2022 14:59:24 +0000

It’s time for the “Olympics of machine learning” again, and if you’re tired of seeing Nvidia at the top of the podium over and over, too bad. At least this time, the GPU powerhouse put a new contender into the mix, its Hopper GPU, which delivered as much as 4.5 times the performance of its predecessor and is due out in a matter of months. But Hopper was not alone in making it to the podium at MLPerf Inferencing v2.1. Systems based on Qualcomm’s AI 100 also made a good showing, and there were other new chips, new types of neural networks, and even new, more realistic ways of testing them.

Before I go on, let me repeat the canned answer to “What the heck is MLPerf?”

MLPerf is a set of benchmarks agreed upon by members of the industry group MLCommons. It is the first attempt to provide apples-to-apples comparisons of how good computers are at training and executing (inferencing) neural networks. In MLPerf’s inferencing benchmarks, systems made up of combinations of CPUs and GPUs or other accelerator chips are tested on up to six neural networks that perform a variety of common functions—image classification, object detection, speech recognition, 3D medical imaging, natural-language processing, and recommendation. The networks had already been trained on a standard set of data and had to make predictions about data they had not been exposed to before.

This slide from Nvidia sums up the whole MLPerf effort. Six benchmarks [left] are tested on two types of computers (data center and edge) in a variety of conditions [right]. Nvidia

Tested computers are categorized as intended for data centers or "the edge." Commercially available data-center systems were tested under two conditions: a simulation of real data-center activity, where queries arrive in bursts, and "offline" activity, where all the data is available at once. Computers meant to work on-site instead of in the data center (what MLPerf calls the edge, because they're located at the edge of the network) were measured three ways: offline; receiving a single stream of data, such as from a security camera; and handling multiple streams of data, the way a car with several cameras and sensors would. In addition to raw performance, computers could also compete on efficiency.

The contest was further divided into a “closed” category, where everybody had to run the same “mathematically equivalent” neural networks and meet the same accuracy measures, and an “open” category, where companies could show off how modifications to the standard neural networks make their systems work better. In the contest with the most powerful computers under the most stringent conditions, the closed data-center group, computers with AI accelerator chips from four companies competed: Biren, Nvidia, Qualcomm, and Sapeon. (Intel made two entries without any accelerators, to demonstrate what its CPUs could do on their own.)

While several systems were tested on the entire suite of neural networks, most results were submitted for image recognition, with the natural-language processor BERT (short for Bidirectional Encoder Representations from Transformers) a close second, making those categories the easiest to compare. Several Nvidia-GPU-based systems were tested on the entire suite of benchmarks, but performing even one benchmark can take more than a month of work, engineers involved say.

On the image-recognition trial, startup Biren’s new chip, the BR104, performed well. An eight-accelerator computer built with the company’s partner, Inspur, blasted through 424,660 samples per second, the fourth-fastest system tested, behind a Qualcomm Cloud AI 100-based machine with 18 accelerators, and two Nvidia A100-based R&D systems from Nettrix and H3C with 20 accelerators each.

But Biren really showed its power on natural-language processing, beating all the other four-accelerator systems by at least 33 percent on the highest-accuracy version of BERT and by even bigger margins among eight-accelerator systems.

An Intel system based on two soon-to-be-released Xeon Sapphire Rapids CPUs without the aid of any accelerators was another standout, edging out a machine using two current-generation Xeons in combination with an accelerator. The difference is partly down to Sapphire Rapids’ Advanced Matrix Extensions, an accelerator worked into each of the CPU’s cores.

Sapeon presented two systems with different versions of their Sapeon X220 accelerator, testing them only on image recognition. Both handily beat the other single-accelerator computers at this, with the exception of Nvidia’s Hopper, which got through six times as much work.

Computers with multiple GPUs or other AI accelerators typically run faster than those with a single accelerator. But on a per-accelerator basis, Nvidia's upcoming H100 pretty much crushed it. Nvidia
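The per-accelerator comparison in that chart is just total throughput divided by accelerator count. A minimal sketch using the Biren figure reported in the text (the other entries are hypothetical placeholders, not reported results):

```python
# Normalize total benchmark throughput by accelerator count.
# Only the Biren entry uses a number from the article; the rest are made up.
systems = {
    "Biren BR104 x8": (424_660, 8),       # samples/s, accelerator count (from the article)
    "Hypothetical A x18": (500_000, 18),  # placeholder
    "Hypothetical B x20": (520_000, 20),  # placeholder
}
per_acc = {name: total / n for name, (total, n) in systems.items()}
for name, rate in sorted(per_acc.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {rate:,.0f} samples/s per accelerator")
```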

In fact, among systems with the same configuration, Nvidia’s Hopper topped every category. Compared to its predecessor, the A100 GPU, Hopper was at least 1.5 times and up to 4.5 times as fast on a per-accelerator basis, depending on the neural network under test. “H100 came in and really brought the thunder,” says Dave Salvator, Nvidia’s director of product marketing for accelerated cloud computing. “Our engineers knocked it out of the park.”

Hopper’s not-secret-at-all sauce is a system called the transformer engine. Transformers are a class of neural networks that include the natural-language processor in the MLPerf inferencing benchmarks, BERT. The transformer engine is meant to speed inferencing and training by adjusting the precision of the numbers computed in each layer of the neural network, using the minimum needed to reach an accurate result. This includes computing with a modified version of 8-bit floating-point numbers. (Here’s a more complete explanation of reduced-precision machine learning.)
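To see how fewer mantissa bits trade accuracy for speed, here is a toy rounding function in the spirit of that reduced-precision arithmetic (illustrative only; real FP8 formats such as E4M3 also clamp the exponent range and handle special values):

```python
import math

def quantize(x: float, mantissa_bits: int = 3) -> float:
    """Round x to a float with the given mantissa width, mimicking the kind
    of low-precision rounding the transformer engine applies per layer."""
    if x == 0.0:
        return 0.0
    e = math.floor(math.log2(abs(x)))      # exponent of x
    scale = 2.0 ** (e - mantissa_bits)     # spacing between representable values
    return round(x / scale) * scale

print(quantize(1.0))   # exactly representable: 1.0
print(quantize(0.1))   # nearest representable value: 0.1015625
```

The relative rounding error is bounded by roughly 2^-(mantissa_bits + 1), which is why lower-precision layers are chosen only where accuracy survives.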

Because these results are a first attempt at the MLPerf benchmarks, Salvator says to expect the gap between H100 and A100 to widen, as engineers discover how to get the most out of the new chips. There’s good precedence for that. Through software and other improvements, engineers have been able to speed up A100 systems continuously since its introduction in May 2020.

Salvator says to expect H100 results for MLPerf’s efficiency benchmarks in future, but for now the company is focused on seeing what kind of performance they can get out of the new chip.


On the efficiency front, Qualcomm Cloud AI 100-based machines did themselves proud, but this was in a much smaller field than the performance contest. (MLPerf representatives stressed that computers are configured differently for the efficiency tests than for the performance tests, so it's only fair to compare the performance of systems configured to the same purpose.) On the offline image-recognition benchmark for data-center systems, Qualcomm took the top three spots in terms of the number of images recognized per joule expended. The contest for efficiency on BERT was much closer: Qualcomm took the top spot for the 99-percent-accuracy version, but it lost out to an Nvidia A100 system at the 99.99-percent-accuracy task. In both cases the race was close.

The case was similar for image recognition for edge systems, with Qualcomm taking nearly all the top spots by dealing with streams of data in less than a millisecond in most cases and often using less than 0.1 joules to do it. Nvidia’s Orin chip, due out within six months, came closest to matching the Qualcomm results. Again, Nvidia was better with BERT, using less energy, though it still couldn’t match Qualcomm’s speed.


There was a lot going on in the “open” division of MLPerf, but one of the more interesting results was how companies have been showing how well and efficiently “sparse” networks perform. These take a neural network and prune it down, removing nodes that contribute little or nothing toward producing a result. The much smaller network can then, in theory, run faster and more efficiently while using less compute and memory resources.
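Magnitude pruning, the simplest form of what these companies do, just zeroes out the smallest weights. A minimal sketch (real sparsity engines also need kernels or hardware that skip the zeros to realize an actual speedup):

```python
def prune_by_magnitude(weights, sparsity=0.9):
    """Zero out the smallest-magnitude fraction of weights.
    (Ties at the threshold are also dropped in this simple version.)"""
    k = int(len(weights) * sparsity)   # number of weights to remove
    threshold = sorted(abs(w) for w in weights)[k - 1] if k else float("-inf")
    return [w if abs(w) > threshold else 0.0 for w in weights]

w = [0.01, -0.8, 0.05, 1.2, -0.02, 0.3, -0.001, 0.6, 0.09, -0.4]
pruned = prune_by_magnitude(w, sparsity=0.5)   # half the weights become zero
print(pruned)
```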

For example, startup Moffett AI showed results for three computers using its Antoum accelerator architecture for sparse networks. Moffett tested the systems, which are intended for data-center use on image recognition and natural-language processing. At image recognition, the company’s commercially available system managed 31,678 samples per second, and its coming chip hit 95,784 samples per second. For reference, the H100 hit 95,784 samples per second, but the Nvidia machine was working on the full neural network and met a higher accuracy target.

Another sparsity-focused firm, Neural Magic, showed off software that applies sparsity algorithms to neural networks so that they run faster on commodity CPUs. Its algorithms decreased the size of a version of BERT from 1.3 gigabytes to about 10 megabytes and boosted throughput from about 10 samples per second to 1,000, the company says.

And finally, Tel Aviv-based Deci used software it calls Automated Neural Architecture Construction technology (AutoNAC) to produce a version of BERT optimized to run on an AMD CPU. The resulting network sped throughput more than sixfold using a model that was one-third the size of the reference neural network.

And More

With more than 7,400 measurements across a host of categories, there’s a lot more to unpack. Feel free to take a look yourself at MLCommons.

Match ID: 10 Score: 5.00 source: spectrum.ieee.org age: 12 days
qualifiers: 5.00 (best|good|great) (show|movie)

Fri, 09 Sep 2022 15:55:30 +0000

Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

CLAWAR 2022: 12–14 September 2022, AZORES, PORTUGAL
IROS 2022: 23–27 October 2022, KYOTO, JAPAN
ANA Avatar XPrize Finals: 4–5 November 2022, LOS ANGELES
CoRL 2022: 14–18 December 2022, AUCKLAND, NEW ZEALAND

Enjoy today’s videos!

DARPA’s AdvaNced airCraft Infrastructure-Less Launch And RecoverY X-Plane program, nicknamed ANCILLARY, aims to develop and flight demonstrate critical technologies required for a leap ahead in vertical takeoff and landing (VTOL), low-weight, high-payload, and long-endurance capabilities.


Behold the tastiest robot ever, thanks to the 40 kilograms of dark chocolate that it’s made of.

[ Amaury Guichon ]

When a video features a robot operating outdoors while being pursued by a human with a laptop on a cart, you know it’s going to be some cutting-edge stuff. In this case, it’s the University of Michigan’s Cassie autonomously navigating based on directions from a hand-drawn map.

First, we show Cassie a map with a hand-drawn path, which she needs to follow. Second, she localizes herself into the OpenStreetMap, used as a topological global map. Third, she then converts the drawn path to her own understanding in the OpenStreetMap. Fourth, she determines terrain types such as sidewalks, roads, and grass. Fifth, she decides what categories she should walk on at the moment. Sixth, a multi-layered map is built. Seventh, a reactive CLF planning algorithm is guiding Cassie to walk safely without hitting obstacles. Finally, the planning signal is sent to Cassie’s 20 degree-of-freedom motion controller.

[ University of Michigan ]

Thanks, Bruce!

Apparently Indonesia drone laws are very permissive? Or they are for DJI, anyway.

[ DJI Avata ]

Waymo Co-CEO Dmitri Dolgov recently took another rider-only trip around San Francisco. Watch as the Waymo Driver reacts dynamically to other human drivers, cyclists, and pedestrians during the nearly hour-long ride.

[ Waymo ]

This capacitive sensing skin will keep you from getting whacked by a robot arm.

[ Paper ]

Dexterous Teleoperation combining shadow hand with real-time volumetric telepresence rendering in VR.

[ Extend Robotics ]

Breathtaking landscape aerial cinematography is made easy when using Skydio drone technology! Enjoy some of our favorite scenic landscape moments from the Skydio community.

[ Skydio ]

Most people think of intelligence as existing in the computer or our brain. Artificial intelligence recognizes faces, understands speech, picks movies, and corrects typos. These tasks are well-suited for computers. But when it comes to roboticists, they are all about physical tasks in the real world. And intelligence is no longer confined to the realm of the bits; the intelligent agent is a robot. Professor Matei Ciocarlie’s Robotic Manipulation and Mobility lab is embodying intelligence in robot hands to solve the problem of physical interaction in our complicated world.

[ ROAM Lab ]

In this episode of our Robot Spotlight series, we showcase a Polaris GEM electric vehicle that has been outfitted with our OutdoorNav autonomy software. Watch the video to learn how it all came together and to find out if the team was able to use the autonomy software to navigate the vehicle through a local shopping plaza and through a Starbucks drive thru.

[ Clearpath ]

Two research talks from UPenn’s GRASP lab: Nadia Figueroa on Collaborative Human-Aware Robotics, and M. Ani Hsieh on Robots for Climate, Energy, and Stability.

[ GRASP Lab ]

Match ID: 11 Score: 2.86 source: spectrum.ieee.org age: 15 days
qualifiers: 2.86 movie

How Flyback Rocket Boosters Got Off the Ground
Mon, 21 Mar 2022 20:27:59 +0000

In the popular conception of a technological breakthrough, a flash of genius is followed quickly by commercial or industrial success, public acclaim, and substantial wealth for a small group of inventors and backers. In the real world, it almost never works out that way.

Advances that seem to appear suddenly are often backed by decades of development. Consider steam engines. Starting in the second quarter of the 19th century they began powering trains, and they soon revolutionized the transportation of people and goods. But steam engines themselves had been invented at the beginning of the 18th century. For 125 years they had been used to pump water out of mines and then to power the mills of the Industrial Revolution.

Lately we’ve become accustomed to seeing rocket boosters return to Earth and then land vertically, on their tails, ready to be serviced and flown again. (Much the same majestic imagery thrilled sci-fi moviegoers in the 1950s.) Today, both SpaceX and Blue Origin are using these techniques, and a third startup, Relativity Space, is on the verge of joining them. Such reusable rocketry is already cutting the cost of access to space and, with other advances yet to come, will help make it possible for humanity to return to the moon and eventually to travel to Mars.

Vertical landings, too, have a long history, with the same ground being plowed many times by multiple research organizations. From 1993 to 1996 a booster named DCX, for Delta Clipper Experimental, took off and landed vertically eight times at White Sands Missile Range. It flew to a height of only 2,500 meters, but it successfully negotiated the very tricky dynamics of landing a vertical cylinder on its end.

The key innovations that made all this possible happened 50 or more years ago. And those in turn built upon the invention a century ago of liquid-fueled rockets that can be throttled up or down by pumping more or less fuel into a combustion chamber.

In August 1954 the Rolls-Royce Thrust Measuring Rig, also known as the “flying bedstead,” took off and landed vertically while carrying a pilot. The ungainly contraption had two downward-pointing Rolls-Royce jet engines with nozzles that allowed the pilot to vector the thrust and control the flight. By 1957 another company, Hawker Siddeley, started work on turning this idea into a vertical take-off and landing (VTOL) fighter jet. It first flew in 1967 and entered service in 1969 as the Harrier Jump Jet, with new Rolls-Royce engines specifically designed for thrust vectoring. Thrust vectoring is a critical component of control for all of today’s reusable rocket boosters.

During the 1960s another rig, also nicknamed the flying bedstead, was developed in the United States for training astronauts to land on the moon. A gimbaled rocket engine always pointed directly downward, providing thrust equal to five-sixths of the combined weight of the vehicle and pilot, simulating lunar gravity. The pilot then controlled the thrust and direction of another rocket engine to land the vehicle safely.
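The five-sixths figure follows directly from the physics: cancel that fraction of the vehicle's weight and the remaining downward acceleration is one-sixth of Earth's gravity, which is roughly the Moon's. A minimal sketch of the arithmetic (illustrative only; the function name is my own):

```python
# If an engine cancels a fraction of the vehicle's weight, the leftover
# downward acceleration is g * (1 - fraction). Cancelling 5/6 of the
# weight leaves ~1/6 g, close to lunar gravity.
G_EARTH = 9.81  # m/s^2

def effective_gravity(cancelled_fraction: float) -> float:
    """Downward acceleration remaining after partial weight cancellation."""
    return G_EARTH * (1.0 - cancelled_fraction)

lunar_like = effective_gravity(5.0 / 6.0)
print(round(lunar_like, 3))  # 1.635 m/s^2, near the Moon's 1.62 m/s^2
```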

It was not all smooth flying. Neil Armstrong first flew the trainer in March 1967, but he was nearly killed in May 1968 when things went awry and he had to use the ejection seat to rocket to safety. The parachute deployed and he hit the ground just 4 seconds later. Rocket-powered vertical descent was harder than it looked.


Nevertheless, between 1969 and 1972, Armstrong and then five other astronauts piloted lunar modules to vertical landings on the moon. There were no ejection seats, and these remain the only crewed rocket-powered landings in spaceflight. All other humans lofted into space have used Earth’s atmosphere to slow down, combining heat shields with either wings or parachutes.

In the early days of Blue Origin, the company returned to the flying-bedstead approach, and its vehicle took off and landed successfully in March 2005. It was powered by four jet engines, once again from Rolls-Royce, bought secondhand from the South African Air Force. Ten years later, in November 2015, Blue Origin’s New Shepard booster reached an altitude of 100 kilometers and then landed vertically. A month later SpaceX had its first successful vertical landing of a Falcon 9 booster.

Today’s reusable, or flyback, boosters also use something called grid fins, those honeycombed panels sticking out perpendicularly from the top of a booster that guide the massive cylinder as it falls through the atmosphere unpowered. The fins have an even longer history, as they have been part of every crewed Soyuz launch since the 1960s. They guide the capsule back to Earth if there’s an abort during the climb to orbit. They were last used in October 2018 when a Soyuz failed at 50 km up. The cosmonaut and astronaut who were aboard landed safely and had a successful launch in another Soyuz five months later.

The next big accomplishment will be crewed vertical landings, 50 years after mankind's last one, on the moon. It will almost certainly happen before this decade is out.

I’m less confident that we’ll see general-purpose quantum computers and abundant electricity from nuclear fusion in that time frame. But I’m pretty sure we’ll eventually get there with both. The arc of technology development is often long. And sometimes, the longer it is, the more revolutionary it is in the end.

This article appears in the April 2022 print issue as “The Long Road to Overnight Success.”

Match ID: 12 Score: 2.86 source: spectrum.ieee.org age: 186 days
qualifiers: 2.86 movie

Most Frequently Asked Questions About NFTs (Non-Fungible Tokens)
Sun, 06 Feb 2022 10:04:00 +0000



Non-fungible tokens (NFTs) are among the most popular digital assets today, capturing the attention of cryptocurrency investors, whales and people from around the world. Many find it puzzling that some users spend thousands or even millions of dollars on a single NFT-based image of a monkey or other token when anyone can take a screenshot for free. So here we answer some frequently asked questions about NFTs.

1) What is an NFT?

NFT stands for non-fungible token: a cryptographic token on a blockchain with a unique identification code that distinguishes it from every other token. NFTs are unique and not interchangeable, which means no two NFTs are the same. An NFT can represent a unique artwork, GIF, image, video, audio album, in-game item, collectible and more.
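The "unique identification code" is the heart of non-fungibility. A minimal sketch of the idea, assuming a toy in-memory registry (the class and field names are my own, loosely modeled on the ERC-721 pattern, not any real contract):

```python
# Toy registry: every token gets a distinct ID mapped to exactly one owner,
# so no two tokens are interchangeable. A real NFT lives on a blockchain;
# this only illustrates the data model.
class TokenRegistry:
    def __init__(self):
        self.owner_of = {}    # token_id -> owner address
        self.token_uri = {}   # token_id -> link to the artwork/metadata
        self._next_id = 0

    def mint(self, owner: str, uri: str) -> int:
        token_id = self._next_id       # the unique identification code
        self.owner_of[token_id] = owner
        self.token_uri[token_id] = uri
        self._next_id += 1
        return token_id

reg = TokenRegistry()
a = reg.mint("alice", "ipfs://artwork-1")
b = reg.mint("bob", "ipfs://artwork-2")
print(a != b)  # True: each token ID is distinct, hence non-fungible
```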

2) What is Blockchain?

A blockchain is a distributed digital ledger that allows for the secure storage of data. By recording any kind of information—such as bank account transactions, the ownership of non-fungible tokens (NFTs), or decentralized finance (DeFi) smart contracts—and distributing copies to many different computers, blockchains ensure that data can’t be manipulated without everyone in the system being aware.

3) What makes an NFT valuable?

The value of an NFT comes from its ability to be traded freely and securely on the blockchain, which is not possible with other current digital-ownership solutions. The NFT points to its location on the blockchain, but doesn’t necessarily contain the digital property itself. Fungible assets are interchangeable: if you replace one bitcoin with another, you still have the same thing. A non-fungible item, such as a movie ticket, can’t be swapped for just any other ticket, because each one is tied to a specific time and place.

4) How do NFTs work?

One of the defining characteristics of a non-fungible token is that it acts as a digital certificate of ownership that can be bought, sold and traded on the blockchain.

As with cryptocurrency, records of who owns what are stored on a ledger maintained by thousands of computers around the world. These records can’t be forged, because any change would have to be accepted across the whole distributed network.

NFTs also contain smart contracts—small computer programs that run on the blockchain—that give the artist, for example, a cut of any future sale of the token.
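That royalty logic is simple to state in code. A toy sketch of the payout split (an assumption for illustration, not real smart-contract code; the function and names are my own):

```python
# On each resale, route a fixed percentage of the price to the original
# artist and the remainder to the seller, as an NFT royalty clause would.
def settle_sale(price: float, royalty_pct: float, artist: str, seller: str) -> dict:
    """Split a resale price between the artist's royalty and the seller."""
    royalty = price * royalty_pct / 100.0
    return {artist: royalty, seller: price - royalty}

payouts = settle_sale(price=1000.0, royalty_pct=10.0, artist="artist", seller="collector")
print(payouts)  # {'artist': 100.0, 'collector': 900.0}
```

On-chain, this split is enforced automatically every time the token changes hands, which is what lets creators earn from secondary sales without chasing buyers.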

5) What’s the connection between NFTs and cryptocurrency?

Non-fungible tokens (NFTs) aren't cryptocurrencies, but they do use blockchain technology. Many NFTs are based on Ethereum, where the blockchain serves as a ledger for all the transactions related to a given NFT and the properties it represents.

6) How do you make an NFT?

Anyone can create an NFT. All you need is a digital wallet, a small amount of Ether and a connection to an NFT marketplace where you’ll be able to upload and sell your creations.

7) How do you validate the authenticity of an NFT?

When you purchase an NFT, that purchase is recorded on the blockchain’s ledger of transactions, and that entry acts as your proof of ownership.

8) How is an NFT valued? What are the most expensive NFTs?

The value of an NFT varies a lot based on the digital asset up for grabs. People use NFTs to trade and sell digital art, so when creating an NFT, you should consider the popularity of your digital artwork along with historical statistics.

In 2021, the digital artist Pak created an artwork called The Merge. It was sold on the Nifty Gateway NFT marketplace for $91.8 million.

9) Can NFTs be used as an investment?

Non-fungible tokens can be used in investment opportunities. One can purchase an NFT and resell it at a profit. Certain NFT marketplaces let sellers of NFTs keep a percentage of the profits from sales of the assets they create.

10) Will NFTs be the future of art and collectibles?

Many people want to buy NFTs because it lets them support the arts and own something cool from their favorite musicians, brands, and celebrities. NFTs also give artists an opportunity to program in continual royalties if someone buys their work. Galleries see this as a way to reach new buyers interested in art.

11) How do you buy an NFT?

There are many places to buy digital assets, such as OpenSea, and their policies vary. On NBA Top Shot, for instance, you sign up for a waitlist that can be thousands of people long. When a digital asset goes on sale, you are occasionally chosen to purchase it.

12) Can I mint an NFT for free?

To mint an NFT, you normally pay a gas fee to process the transaction on the Ethereum blockchain, but you can mint on a different blockchain, Polygon, to avoid gas fees. This option is available on OpenSea; it simply means your NFT can only be traded on Polygon’s blockchain rather than Ethereum’s. Mintable also lets you mint NFTs without paying any gas fees.

13) Do I own an NFT if I screenshot it?

The answer is no. Non-fungible tokens are minted on blockchains such as Ethereum, Solana and Polygon. Once an NFT is minted, the transaction is recorded on the blockchain, and the contract or license belongs to whoever holds that token in their wallet.

14) Why are people investing so much in NFTs?

Non-fungible tokens have won the hearts of people around the world and have given digital creators the recognition they deserve. One remarkable thing about non-fungible tokens is that you can take a screenshot of one, yet still not own it. When a non-fungible token is created, the transaction is stored on the blockchain, and the license or contract to hold the token belongs to the person holding it in their digital wallet.

You can sell your work and creations by attaching a license to them on the blockchain, where ownership can be transferred. This lets you get exposure without losing full ownership of your work. Some of the most successful projects include CryptoPunks, Bored Ape Yacht Club, The Sandbox and World of Women. These NFT projects have gained popularity globally and are owned by celebrities and other successful entrepreneurs. Owning one of these NFTs can even serve as a ticket to exclusive events and valuable connections.

Final Thoughts

That’s a wrap. I hope you found this article enlightening; I’ve answered these questions to the best of my knowledge of NFTs. If you have any questions or suggestions, feel free to drop them in the comment section below. I also have a question for you: is Bitcoin an NFT? Let me know in the comments.

Match ID: 13 Score: 2.86 source: www.crunchhype.com age: 230 days
qualifiers: 2.86 movie

Filter efficiency 98.177 (14 matches/768 results)


RSS Rabbit links users to publicly available RSS entries.
Vet every link before clicking! The creators accept no responsibility for the contents of these entries.






We're not prepared to take user feedback yet. Check back soon!

rssRabbit quadric