********** CLIMATE **********
Methane to food waste: eight ways to attempt to stay within 1.5C
Tue, 21 Mar 2023 19:10:39 GMT
Latest IPCC report highlights key measures countries must take to avoid climate catastrophe
The Intergovernmental Panel on Climate Change published the “synthesis report” of its sixth assessment report (AR6) on Monday. Eight years in preparation, this mammoth report encompasses the entire range of human knowledge of the climate system, compiled by hundreds of scientists from thousands of academic papers, and published in four parts, in August 2021, February and April 2022, and March 2023.
The report drew together the most important findings – but also highlighted some key measures that governments and countries must take immediately if we are to avoid climate catastrophe:
Positive framing of otherwise grim report a counterblast to those who dismiss hopes of limiting global heating to 1.5C
Avoiding the worst ravages of climate breakdown is still possible, and there are “multiple, feasible and effective options” for doing so, the Intergovernmental Panel on Climate Change has said.
Hoesung Lee, chair of the body, which is made up of the world’s leading climate scientists, made clear that – despite the widespread damage already being caused by extreme weather, and the looming threat of potentially catastrophic changes – the future was still humanity’s to shape.
Growing signs that Coalition leadership could swing weight behind bill to alter constitution. Follow the day’s news live
It’s world water day!
Which I didn’t know. And I didn’t get water anything. Awkward.
We’re not supporters of changes that enable companies to buy offsets, because this is just an easy means to cover obligations.
I’ve had the major fossil fuels companies of the world try and argue with me that they can go zero net carbon per barrel of oil just by buying offsets. Which is code for ‘we’re not going to change a thing, we’re just going to buy these half-real carbon credits’.”
Councillors vote to chop down trees in Coton Orchard for busway from Cambridge to Cambourne
Hundreds of trees in an orchard designated as a habitat of principal importance in England should be felled to build a new busway to tackle climate change, councillors in Cambridgeshire voted on Tuesday.
The county council voted by 33 to 26 to approve a new public transport busway, which will use optically guided electric or hybrid buses on its route, to provide links between Cambridge and Cambourne, an expanding new town eight miles outside the city.
A pact to phase out fossil fuels in November’s UN climate talks is the only credible response to the warnings of scientists
Yesterday the Intergovernmental Panel on Climate Change released a new synthesis report. The document is important because 195 governments commissioned it and the summary was agreed line by line. It is accepted fact by nations worldwide, and a shared basis for future action.
The report’s conclusions are terrifying and wearily familiar. Every region is experiencing “widespread adverse impacts”. Almost half the world’s population is “highly vulnerable” to climate change impacts. Expected repercussions will escalate rapidly. It concludes that there is a “rapidly closing window of opportunity” to secure a livable future.
Simon Lewis is professor of global change science at University College London and University of Leeds
Regulation may allow ‘hydrogen-ready’ boilers that can run on fossil fuel gas, and are unlikely ever to use hydrogen
Ministers are preparing to allow new houses to continue to be fitted with gas boilers, long after they were supposed to be phased out, campaigners fear.
A loophole being considered for the forthcoming future homes standard, a housing regulation in England intended to reduce greenhouse gas emissions from newly built homes in line with the net zero target, would allow new homes to be fitted with “hydrogen-ready” boilers.
IPCC report says only swift and drastic action can avert irrevocable damage to world
Scientists have delivered a “final warning” on the climate crisis, as rising greenhouse gas emissions push the world to the brink of irrevocable damage that only swift and drastic action can avert.
The Intergovernmental Panel on Climate Change (IPCC), made up of the world’s leading climate scientists, set out the final part of its mammoth sixth assessment report on Monday.
Tesla’s investor day on 1 March began with a rambling, detailed discourse on energy and the environment before transitioning into a series of mostly predictable announcements and boasts. And then, out of nowhere, came an absolute bombshell: “We have designed our next drive unit, which uses a permanent-magnet motor, to not use any rare-earth elements at all,” declared Colin Campbell, Tesla’s director of power-train engineering.
It was a stunning disclosure that left most experts in permanent magnetism wary and perplexed. Alexander Gabay, a researcher at the University of Delaware, states flatly: “I am skeptical that any non-rare-earth permanent magnet could be used in a synchronous traction motor in the near future.” And at Uppsala University, in Sweden, Alena Vishina, a physicist, elaborates, “I’m not sure it’s possible to use only rare-earth-free materials to make a powerful and efficient motor.”
And at a recent magnetics conference Ping Liu, a professor at the University of Texas, in Arlington, asked other researchers what they thought of Tesla’s announcement. “No one fully understands this,” he reports. (Tesla did not respond to an e-mail asking for elaboration of Campbell’s comment.)
Tesla’s technical prowess should never be underestimated. But on the other hand, the company—and in particular, its CEO—has a history of making sporadic sensational claims that don’t pan out (we’re still waiting for that US $35,000 Model 3, for example).
The problem here is physics, which not even Tesla can alter. Permanent magnetism occurs in certain crystalline materials when the spins of electrons of some of the atoms in the crystal are forced to point in the same direction. The more of these aligned spins, the stronger the magnetism. For this, the ideal atoms are ones that have unpaired electrons swarming around the nucleus in what are known as 3d orbitals. Tops are iron, with four unpaired 3d electrons, and cobalt, with three.
But 3d electrons alone are not enough to make superstrong magnets. As researchers discovered decades ago, magnetic strength can be greatly improved by adding to the crystalline lattice atoms with unpaired electrons in the 4f orbital—notably the rare-earth elements neodymium, samarium, and dysprosium. These 4f electrons enhance a characteristic of the crystalline lattice called magnetic anisotropy—in effect, they promote adherence of the magnetic moments of the atoms to the specific directions in the crystal lattice. That, in turn, can be exploited to achieve high coercivity, the essential property that lets a permanent magnet stay magnetized. Also, through several complex physical mechanisms, the unpaired 4f electrons can amplify the magnetism of the crystal by coordinating and stabilizing the spin alignment of the 3d electrons in the lattice.
Since the 1980s, a permanent magnet based on a compound of neodymium, iron, and boron (NdFeB) has dominated high-performance applications, including motors, smartphones, loudspeakers, and wind-turbine generators. A 2019 study by Roskill Information Services, in London, found that more than 90 percent of the permanent magnets used in automotive traction motors were NdFeB.
So if not rare-earth permanent magnets for Tesla’s next motor, then what kind? Among experts willing to speculate, the choice was unanimous: ferrite magnets. Among the non-rare-earth permanent magnets invented so far, only two are in large-scale production: ferrites and another type called Alnico (aluminum nickel cobalt). Tesla isn’t going to use Alnico, a half-dozen experts contacted by IEEE Spectrum insisted. These magnets are weak and, more important, the world supply of cobalt is so fraught that they make up less than 2 percent of the permanent-magnet market.
Ferrite magnets, based on a form of iron oxide, are cheap and account for nearly 30 percent of the permanent-magnet market by sales. But they, too, are weak (one major use is holding refrigerator doors shut). A key performance indicator of a permanent magnet is its maximum energy product, measured in megagauss-oersteds (MGOe). It reflects both the strength of a magnet and its coercivity. For the type of NdFeB commonly used in automotive traction motors, this value is generally around 35 MGOe. For the best ferrite magnets, it is around 4.
“Even if you get the best-performance ferrite magnet, you will have performance about five to 10 times below neodymium-iron-boron,” says Daniel Salazar Jaramillo, a magnetics researcher at the Basque Center for Materials, Applications, and Nanostructures, in Spain. So compared to a synchronous motor built with NdFeB magnets, one based on ferrite magnets will be much larger and heavier, much weaker, or some combination of the two.
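A rough back-of-the-envelope sketch of that gap, assuming (simplistically) that the magnet volume required for a given task scales inversely with the maximum energy product; the figures are the ones quoted above:

```python
# Compare magnet volume needed for the same job, under the simplifying
# assumption that required volume scales inversely with (BH)max.
BHMAX_MGOE = {
    "NdFeB (automotive grade)": 35.0,  # quoted above
    "Best ferrite": 4.0,               # quoted above
}

baseline = BHMAX_MGOE["NdFeB (automotive grade)"]
for name, bhmax in BHMAX_MGOE.items():
    rel_volume = baseline / bhmax  # volume relative to the NdFeB magnet
    print(f"{name}: (BH)max = {bhmax:.0f} MGOe, "
          f"~{rel_volume:.1f}x the NdFeB volume for equivalent energy")
```

On this crude scaling, the best ferrite needs nearly nine times the volume of NdFeB, consistent with Salazar Jaramillo’s “five to 10 times below” estimate.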
To be sure, there are more than a score of other permanent magnets that use no rare-earth elements or don’t use much of them. But none of these have made an impact outside the laboratory. The list of attributes needed for a commercially successful permanent magnet includes high field strength, high coercivity, tolerance of high temperatures, good mechanical strength, ease of manufacturing, and lack of reliance on elements that are scarce, toxic, or problematic for some other reason. All of the candidates today fail to tick one or more of these boxes.
Iron-nitride magnets, such as this one from startup Niron Magnetics, are among the most promising of an emerging crop of permanent magnets that do not use rare-earth elements. Niron Magnetics
But give it a few more years, say some researchers, and one or two of these could very well break through. Among the most promising: iron nitride, Fe16N2. A Minneapolis startup, Niron Magnetics, is now commercializing technology that was pioneered with funding from ARPA-E by Jian Ping Wang at the University of Minnesota in the early 2000s, after earlier work at Hitachi. Niron’s executive vice president, Andy Blackburn, told Spectrum that the company intends to introduce its first product late in 2024. Blackburn says it will be a permanent magnet with an energy product above 10 MGOe, for which he anticipates applications in loudspeakers and sensors, among others. If it succeeds, it will be the first new commercial permanent magnet since NdFeB, 40 years ago, and the first commercial non-rare-earth permanent magnet since strontium ferrite, the best ferrite type, 60 years ago.
Niron’s first offering will be followed in 2025 by a magnet with an energy product above 30 MGOe, according to Blackburn. For this he makes a rather bold prediction: “It’ll have as good or better flux than neodymium. It’ll have the coercivity of a ferrite, and it’ll have the temperature coefficients of samarium cobalt”—better than NdFeB. If the magnet really manages to combine all those attributes (a big if), it would be very well suited for use in the traction motors of electric vehicles.
There will be more to come, Blackburn declares. “All these new nanoscale-engineering capabilities have allowed us to create materials that would have been impossible to make 20 years ago,” he says.
Top Cop Scapegoats Reform DA for Double Murder in Austin
Gov. Greg Abbott’s budget cuts led to the release of a man who went on to be accused in the killings.
As technology continues to evolve, STEM education is needed more than ever. With the vast technical expertise of its 400,000-plus members and volunteers, IEEE is a leader in engineering and technology education. Its technical societies and its councils, sections, and regional groups offer educational events and resources at every level to support technical professions and prepare the workforce of tomorrow.
IEEE offers many ways to support the educational needs of learners. For preuniversity students, the organization offers summer camps and other opportunities to explore science, technology, engineering, and mathematics careers. IEEE’s continuing education courses allow professionals to stay up to date on technology, keep their skills sharp, and learn new things.
From 2 to 8 April, IEEE is highlighting resources available to students, educators, and technical professionals with IEEE Education Week. The annual celebration highlights educational opportunities provided by the world’s largest technical professional association and its many organizational units, societies, and councils.
Here are some of the events and resources available during this year’s Education Week.
Climate Change: IEEE’s Role in Bringing Technology Solutions to Meet the Challenge
3 April, noon to 1 p.m. EDT
IEEE President and CEO Saifur Rahman kicks off Education Week with a session on how the organization can serve as a vital connection between policymakers and the engineering and technology communities in bringing technological solutions to meet the universal challenge of climate change. Rahman plans to share how IEEE is committed to helping mitigate the effects of climate change through pragmatic and accessible technical solutions, as well as by providing engineers and technologists with a neutral space for discussion and action. The webinar also addresses the importance of educating the energy workforce.
3 April, 9 to 10 a.m. EDT
IEEE REACH (Raising Engineering Awareness through the Conduit of History) provides teachers with resources to help them explain the history of technology and the roles played by engineers. During this webinar, participants can learn how REACH can enhance the classroom experience.
5 April, 11 to 11:45 a.m. EDT
Many people are sharing their expertise on TikTok, YouTube, and other online platforms. When sharing knowledge in a multimedia-rich environment, there are research-proven principles that can be applied to enhance the presentation—which in turn promotes knowledge transfer. This webinar is designed to show participants how to apply those principles to their own presentations.
Here are some additional offerings and resources available during IEEE Education Week.
For a list of webinars and events and more resources, visit the IEEE Education Week website.
IEEE-affiliated groups can participate in IEEE Education Week by offering events, resources, and special offers such as discounted courses. Additionally, a tool kit is available to help groups promote IEEE Education Week and their event through newsletters, social media, and more.
The Education Week website provides special offers and discounts as well. You also can support education programs by donating to the IEEE Foundation.
Check out the IEEE Education Week video to learn more.
You do not need to be a member to participate in IEEE Education Week; however, members receive discounted or free access to many of the events and resources.
If you’re not an IEEE member, now would be a great time to join.
For Synopsys Chief Executive Aart de Geus, running the electronic design automation behemoth is similar to being a bandleader. He brings together the right people, organizes them into a cohesive ensemble, and then leads them in performing their best.
De Geus, who helped found the company in 1986, has some experience with bands. The IEEE Fellow has been playing guitar in blues and jazz bands since he was an engineering student in the late 1970s.
Much like jazz musicians improvising, engineers go with the flow at team meetings, he says: One person comes up with an idea, and another suggests ways to improve it.
“There are actually a lot of commonalities between my music hobby and my other big hobby, Synopsys,” de Geus says.
Employer: Synopsys
Title: CEO
Member grade: Fellow
Alma mater: École Polytechnique Fédérale de Lausanne, Switzerland
Synopsys is now the largest supplier of software that engineers use to design chips, employing about 20,000 people. The company reported US $1.36 billion in revenue in the first quarter of this year.
De Geus is considered a founding father of electronic design automation (EDA), which automates chip design using synthesis and other tools that he and his team pioneered in the 1980s. Synthesis revolutionized digital design by taking the high-level functional description of a circuit and automatically selecting the logic components (gates) and constructing the connections (netlist) to build it. Virtually all large digital chips manufactured today are synthesized using software that de Geus and his team developed.
“Synthesis changed the very nature of how digital chips are designed, moving us from the age of computer-aided design (CAD) to electronic design automation (EDA),” he says.
During the past three and a half decades, logic synthesis has enabled about a 10 millionfold increase in chip complexity, he says. For that reason, Electrical Business magazine named him one of the 10 most influential executives in 2002, as well as its 2004 CEO of the Year.
Born in Vlaardingen, Netherlands, de Geus grew up mostly in Basel, Switzerland. He earned a master’s degree in electrical engineering in 1978 from the École Polytechnique Fédérale de Lausanne, known as EPFL, in Lausanne.
In the early 1980s, while pursuing a Ph.D. in electrical engineering from Southern Methodist University, in Dallas, de Geus joined General Electric in Research Triangle Park, N.C. There he developed tools to design logic with multiplexers, according to a 2009 oral history conducted by the Computer History Museum. He and a designer friend created gate arrays with a mix of logic gates and multiplexers.
That led to writing the first program for synthesizing circuits optimized for both speed and area, known as SOCRATES. It automatically created blocks of logic from functional descriptions, according to the oral history.
“The problem was [that] all designers coming out of school used Karnaugh maps, [and] knew NAND gates, NOR gates, and inverters,” de Geus explained in the oral history. “They didn’t know multiplexers. So designing with these things was actually difficult.” Karnaugh maps are a method of simplifying Boolean algebra expressions. With NAND and NOR universal logic gates, any Boolean expression can be implemented without using any other gate.
With SOCRATES, a designer could write a function and, 20 minutes later, the program would generate a netlist naming the electronic components in the circuit and the nodes they connected to. By automating the function, de Geus says, “the synthesizer typically created faster circuits that also used fewer gates. That’s a big benefit because fewer is better. Fewer ultimately end up in [a] smaller area on a chip.”
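A toy illustration of the underlying idea (not SOCRATES itself): because NAND is a universal gate, even a simple function such as (a AND b) OR c can be mechanically rewritten as a NAND-only netlist and checked exhaustively against the original expression:

```python
# Represent a NAND-only netlist as (output, input1, input2) triples and
# verify it implements f(a, b, c) = (a AND b) OR c on all eight inputs.
from itertools import product

def nand(x, y):
    return 1 - (x & y)

# NOT x   = NAND(x, x)
# x AND y = NOT NAND(x, y)
# x OR y  = NAND(NOT x, NOT y)
NETLIST = [
    ("n1", "a", "b"),    # n1 = NAND(a, b)
    ("n2", "n1", "n1"),  # n2 = a AND b
    ("n3", "n2", "n2"),  # n3 = NOT (a AND b)
    ("n4", "c", "c"),    # n4 = NOT c
    ("f",  "n3", "n4"),  # f  = NAND(n3, n4) = (a AND b) OR c
]

def evaluate(netlist, inputs):
    nodes = dict(inputs)
    for out, in1, in2 in netlist:
        nodes[out] = nand(nodes[in1], nodes[in2])
    return nodes["f"]

for a, b, c in product((0, 1), repeat=3):
    assert evaluate(NETLIST, {"a": a, "b": b, "c": c}) == ((a & b) | c)
print("NAND-only netlist matches (a AND b) OR c on all inputs")
```

A synthesis tool does this at vastly larger scale, while also optimizing the netlist for speed and area.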
With that technology, circuit designers shifted their focus from gate-level design to designs based on hardware description languages.
Eventually de Geus was promoted to manager of GE’s Advanced Computer-Aided Engineering Group. Then, in 1986, the company decided to leave the semiconductor business. Facing the loss of his job, he decided to launch his own company to continue to enhance synthesis tools.
He and two members of his GE team, David Gregory and Bill Krieger, founded Optimal Solutions in Research Triangle Park. In 1987 the company was renamed Synopsys and moved to Mountain View, Calif.
De Geus says he picked up his management skills and entrepreneurial spirit as a youngster. During summer vacations, he would team up with friends to build forts, soapbox cars, and other projects. He usually was the team leader, he says, the one with plenty of imagination.
“An entrepreneur creates a vision of some crazy but, hopefully, brilliant idea,” he says, laughing. The vision sets the direction for the project, he says, while the entrepreneur’s business side tries to convince others that the idea is realistic enough.
“The notion of why it could be important was sort of there,” he says. “But it is the passion that catalyzes something in people.”
That was true during his fort-building days, he says, and it’s still true today.
“Synthesis changed the very nature of how digital designs are being constructed.”
“If you have a good team, everybody chips in something,” he says. “Before you know it, someone on the team has an even better idea of what we could do or how to do it. Entrepreneurs who start a company often go through thousands of ideas to arrive at a common mission. I’ve had the good fortune to be on a 37-year mission with Synopsys.”
At the company, de Geus sees himself as “the person who makes the team cook. It’s being an orchestrator, a bandleader, or maybe someone who brings out the passion in people who are better in both technology and business. As a team, we can do things that are impossible to do alone and that are patently proven to be impossible in the first place.”
He says a few years ago the company came up with the mantra “Yes, if …” to combat a slowly growing “No, because …” mindset.
“‘Yes, if …’ opens doors, whereas the ‘No, because …’ says, ‘Let me prove that it’s not possible,’” he says. “‘Yes, if … ’ leads us outside the box into ‘It’s got to be possible. There’s got to be a way.’”
De Geus says his industry is going through “extremely challenging times—technically, globally, and business-wise—and the ‘If … ’ part is an acknowledgment of that. I found it remarkable that once a group of people acknowledge [something] is difficult, they become very creative. We’ve managed to get the whole company to embrace ‘Yes, if …’
“It is now in the company’s cultural DNA.”
One of the issues Synopsys is confronted with is the end of Moore’s Law, de Geus says. “But no worries,” he says. “We are facing an unbelievable new era of opportunity, as we have moved from ‘Classic Moore’ scale complexity to ‘SysMoore,’ which unleashes systemic complexity with the same Moore’s Law exponential ambition!”
He says the industry is moving its focus from single chips to multichip modules, with chips closely placed together on top of a larger, “silicon interposer” chip. In some cases, such as for memory, chips are stacked on top of each other.
“How do you make the connectivity between those chips as fast as possible? How can you technically make these pieces work? And then how can you make it economically viable so it is producible, reliable, testable, and verifiable? Challenging, but so powerful,” he says. “Our big challenge is to make it all work together.”
Pursuing engineering was a calling for de Geus. Engineering was the intersection of two things he loved: carrying out a vision and building things. Notwithstanding the recent wave of tech-industry layoffs, he says he believes engineering is a great career.
“Just because a few companies have overhired or are redirecting themselves doesn’t mean that the engineering field is in a downward trend,” he says. “I would argue the opposite, for sure in the electronics and software space, because the vision of ‘smart everything’ requires some very sophisticated capabilities, and it is changing the world!”
During the Moore’s Law era, one’s technical knowledge has had to be deep, de Geus says.
“You became really specialized in simulation or in designing a certain type of process,” he says. “In our field, we need people who are best in class. I like to call them six-Ph.D.-deep engineers. It’s not just schooling deep; it’s schooling and experientially deep. Now, with systemic complexity, we need to bring all these disciplines together; in other words we now need six-Ph.D.-wide engineers too.”
To obtain that type of experience, he recommends that university students get a sense of multiple subdisciplines and then “choose the one that appeals to you.”
“For those who have a clear sense of their own mission, it’s falling in love and finding your passion,” he says. But those who don’t know which field of engineering to pursue should “engage with people you think are fantastic, because they will teach you things such as perseverance, enthusiasm, passion, what excellence is, and make you feel the wonder of collaboration.” Such people, he says, can teach you to “enjoy work instead of just having a job. If work is also your greatest hobby, you’re a very different person.”
De Geus says engineers must take responsibility for more than the technology they create.
“I always liked to say that ‘he or she who has the brains to understand should have the heart to help.’ With the growing challenges the world faces, I now add that they should also have the courage to act,” he says. “What I mean is that we need to look and reach beyond our field, because the complexity of the world needs courageous management to not become the reason for its own destruction.”
He notes that many of today’s complexities are the result of fabulous engineering, but the “side effects—and I am talking about CO2, for example—have not been accounted for yet, and the engineering debt is now due.”
De Geus points to the climate crisis: “It is the single biggest challenge there is. It’s both an engineering and a social challenge. We need to figure out a way to not have to pay the whole debt. Therefore, we need to engineer rapid technical transitions while mitigating the negatives of the equation. Great engineering will be decisive in getting there.”
For about as long as engineers have talked about beaming solar power to Earth from space, they’ve had to caution that it was an idea unlikely to become real anytime soon. Elaborate designs for orbiting solar farms have circulated for decades—but since photovoltaic cells were inefficient, any arrays would need to be the size of cities. The plans got no closer to space than the upper shelves of libraries.
That’s beginning to change. Right now, in a sun-synchronous orbit about 525 kilometers overhead, there is a small experimental satellite called the Space Solar Power Demonstrator One (SSPD-1 for short). It was designed and built by a team at the California Institute of Technology, funded by donations from the California real estate developer Donald Bren, and launched on 3 January—among 113 other small payloads—on a SpaceX Falcon 9 rocket.
“To the best of our knowledge, this would be the first demonstration of actual power transfer in space, of wireless power transfer,” says Ali Hajimiri, a professor of electrical engineering at Caltech and a codirector of the program behind SSPD-1, the Space Solar Power Project.
The Caltech team is waiting for a go-ahead from the operators of a small space tug to which it is attached, providing guidance and attitude control. If all goes well, SSPD-1 will spend at least five to six months testing prototype components of possible future solar stations in space. In the next few weeks, the project managers hope to unfold a lightweight frame, called DOLCE (short for Deployable on-Orbit ultraLight Composite Experiment), on which parts of future solar arrays could be mounted. Another small assembly on the spacecraft contains samples of 32 different types of photovoltaic cells, intended to see which would be most efficient and robust. A third part of the vehicle contains a microwave transmitter, set up to prove that energy from the solar cells can be sent to a receiver. For this first experiment, the receivers are right there on board the spacecraft, but if it works, an obvious future step would be to send electricity via microwave to receivers on the ground.
Caltech’s Space Solar Power Demonstrator, shown orbiting Earth in this artist’s conception, was launched on 3 January. Caltech
One can dismiss the 50-kilogram SSPD-1 as yet another nonstarter, but a growing army of engineers and policymakers take solar energy from space seriously. Airbus, the European aerospace company, has been testing its own technology on the ground, and government agencies in China, Japan, South Korea, and the United States have all mounted small projects. “Recent technology and conceptual advances have made the concept both viable and economically competitive,” said Frazer-Nash, a British engineering consultancy, in a 2021 report to the U.K. government. Engineers working on the technology say microwave power transmissions would be safe, unlike ionizing radiation, which is harmful to people or other things in its path.
No single thing has happened to start this renaissance. Instead, say engineers, several advances are coming together.
For one thing, the cost of launching hardware into orbit keeps dropping, led by SpaceX and other, smaller companies such as Rocket Lab. SpaceX has a simplified calculator on its website, showing that if you want to launch a 50-kg satellite into sun-synchronous orbit, they’ll do it for US $275,000.
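A quick sanity check on those quoted figures:

```python
# Cost per kilogram implied by SpaceX's quoted rideshare price
# (figures as cited above; prices change over time).
launch_price_usd = 275_000
payload_kg = 50

print(f"Cost per kilogram: ${launch_price_usd / payload_kg:,.0f}/kg")
# -> Cost per kilogram: $5,500/kg
```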
Meanwhile, photovoltaic technology has improved, step by step. Lightweight electronic components keep getting better and cheaper. And there is political pressure as well: Governments and major companies have made commitments to decarbonize in the battle against global climate change, committing to renewable energy sources to replace fossil fuels.
Most solar power, at least for the foreseeable future, will be Earth-based, which will be cheaper and easier to maintain than anything anyone can launch into space. Proponents of space-based solar power say that for now, they see it as best used for specialty needs, such as remote outposts, places recovering from disasters, or even other space vehicles.
But Hajimiri says don’t underestimate the advantages of space, such as unfiltered sunlight that is far stronger than what reaches the ground and is uninterrupted by darkness or bad weather—if you can build an orbiting array light enough to be practical.
Most past designs, dictated by the technology of their times, included impossibly large truss structures to hold solar panels and wiring to route power to a central transmitter. The Caltech team would dispense with all that. An array would consist of thousands of independent tiles as small as 100 square centimeters, each with its own solar cells, transmitter, and avionics. They might be loosely connected, or they might even fly in formation.
Time-lapse images show the experimental DOLCE frame for an orbiting solar array being unfolded in a clean room. Caltech
“The analogy I like to use is that it’s like an army of ants instead of an elephant,” says Hajimiri. Transmission to receivers on the ground could be by phased array—microwave signals from the tiles synchronized so that they can be aimed with no moving parts. And the parts—the photovoltaic cells with their electronics—could perhaps be so lightweight that they’re flexible. New algorithms could keep their signals focused.
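A minimal sketch of the phase arithmetic behind that kind of beam steering, with illustrative numbers rather than the Caltech design’s actual parameters:

```python
# For a uniform line of transmitter tiles with spacing d, steering the
# combined beam to angle theta (from broadside) requires tile n to be
# driven with a phase offset of -2*pi*n*d*sin(theta)/lambda.
import math

FREQ_HZ = 10e9                 # assumed microwave frequency, 10 GHz
WAVELENGTH_M = 3e8 / FREQ_HZ   # ~3 cm
TILE_SPACING_M = 0.10          # assumed 10 cm pitch (~100 cm^2 tiles)
STEER_ANGLE = math.radians(15)

for n in range(4):  # phase offsets for the first four tiles
    phi = -2 * math.pi * n * TILE_SPACING_M * math.sin(STEER_ANGLE) / WAVELENGTH_M
    print(f"tile {n}: phase offset {math.degrees(phi) % 360:7.1f} deg")
```

Because the steering is done entirely in the drive phases, the array needs no moving parts, as the article notes.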
“That’s the kind of thing we’re talking about,” said Harry Atwater, a coleader of the Caltech project, as SSPD-1 was being planned. “Really gossamer-like, ultralight, the limits of mass-density deployable systems.”
If it works out, in 30 years maybe there could be orbiting solar power fleets, adding to the world’s energy mix. In other words, as a recent report from Frazer-Nash concluded, this is “a potential game changer.”
In 2001, a team of engineers at a then-obscure R&D company called AC Propulsion quietly began a groundbreaking experiment. They wanted to see whether an electric vehicle could feed electricity back to the grid. The experiment seemed to prove the feasibility of the technology. The company’s president, Tom Gage, dubbed the system “vehicle to grid” or V2G.
The concept behind V2G had gained traction in the late 1990s after California’s landmark zero-emission-vehicle (ZEV) mandate went into effect and compelled automakers to commercialize electric cars. In V2G, environmental-policy wonks saw a potent new application of the EV that might satisfy many interests. For the utilities, it promised an economical way of meeting rising demand for electricity. For ratepayers, it offered cheaper and more reliable electricity services. Purveyors of EVs would have a new public-policy rationale backing up their market. And EV owners would become entrepreneurs, selling electricity back to the grid.
AC Propulsion’s experiment was timely. It occurred in the wake of the California electricity crisis of 2000 and 2001, when mismanaged deregulation, market manipulation, and environmental catastrophe combined to unhinge the power grid. Some observers thought V2G could prevent the kinds of price spikes and rolling blackouts then plaguing the Golden State. Around the same time, however, General Motors and other automakers were in the process of decommissioning their battery EV fleets, the key component of V2G.
AC Propulsion’s president, Tom Gage, explains the company’s vehicle-to-grid technology at a 2001 conference in Seattle. Photo-illustration: Max-o-matic; photo source: Alec Brooks
The AC Propulsion experiment thus became an obscure footnote in the tortuous saga of the green automobile. A decade later, in the 2010s, the battery EV began an astounding reversal of fortune, thanks in no small part to the engineers at ACP, whose electric-drive technology informed the development of the Roadster, the car that launched Tesla Motors. By the 2020s, automakers around the world were producing millions of EVs a year. And with the revival of the EV, the V2G concept was reborn.
If a modern electronics- and software-laden car can be thought of as a computer on wheels, then an electric car capable of discharging electricity to the grid might be considered a power plant on wheels. And indeed, that’s how promoters of vehicle-to-grid technology perceive the EV.
Keep in mind, though, that electricity’s unique properties pose problems to anyone who would make a business of producing and delivering it. Electricity is a commodity that is bought and sold, and yet unlike most other commodities, it cannot easily be stored. Once electricity is generated and passes into the grid, it is typically used almost immediately. If too much or too little electricity is present in the power grid, the network can suddenly become unbalanced.
At the turn of the 20th century, utilities promoted the use of electric truck fleets to soak up excess electricity. Photo-illustration: Max-o-matic; photo source: M&N/Alamy
Some operators of early direct-current power plants at the turn of the 20th century solved the problem of uneven power output from their generators by employing large banks of rechargeable lead-acid batteries, which served as a kind of buffer to balance the flow of electrons. As utilities shifted to more reliable alternating-current systems, they phased out these costly backup batteries.
Then, as electricity entrepreneurs expanded power generation and transmission capacity, they faced the new problem of what to do with all the cheap off-peak, nighttime electricity they could now produce. Utilities reconsidered batteries, not as stationary units but in EVs. As the historian Gijs Mom has noted, enterprising utility managers essentially outsourced the storage of electricity to the owners and users of the EVs then proliferating in northeastern U.S. cities. Early utility companies like Boston Edison and New York Edison organized EV fleets, favoring electric trucks for their comparatively capacious batteries.
In the early years of the automobile, battery-powered electric cars were competitive with cars fueled by gasoline and other types of propulsion. Photo-illustration: Max-o-matic; image source: Shawshots/Alamy
The problems of grid management that EVs helped solve faded after World War I. In the boom of the 1920s, U.S. utility barons such as Samuel Insull massively expanded the country’s grid systems. During the New Deal era, the federal government began funding the construction of giant hydropower plants and pushed transmission into rural areas. By the 1950s, the grid was moving electricity across time zones and national borders, tying in diverse sources of supply and demand.
The need for large-scale electrochemical energy storage as a grid-stabilizing source of demand disappeared. When utilities considered storage technology at all in the succeeding decades, it was generally in the form of pumped-storage hydropower, an expensive piece of infrastructure that could be built only in hilly terrain.
It wasn’t until the 1990s that the electric car reemerged as a possible solution to problems of grid electricity. In 1997, Willett Kempton, a professor at the University of Delaware, and Steve Letendre, a professor at Green Mountain College, in Vermont, began publishing a series of journal articles that imagined the bidirectional EV as a resource for electricity utilities. The researchers estimated that, if applied to the task of generating electricity, all of the engines in the U.S. light-duty vehicle fleet would produce around 16 times the output of stationary power plants. Kempton and Letendre also noted that the average light vehicle was used only around 4 percent of the time. Therefore, they reasoned, a fleet of bidirectional EVs could be immensely useful to utilities, even if it were only a fraction of the size of the conventional vehicle fleet.
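Working those published figures through makes the point concrete; the ratios below follow directly from the numbers Kempton and Letendre reported:

```python
# Kempton and Letendre's comparison, as summarized above: fleet engine
# output ~16x stationary plant output; vehicles in use ~4% of the time.
fleet_to_grid_ratio = 16
utilization = 0.04

# Fraction of the fleet whose output would match stationary capacity:
print(f"{1 / fleet_to_grid_ratio:.1%} of the fleet matches grid capacity")
# -> 6.2% of the fleet matches grid capacity

# And the share of vehicles parked (potentially grid-connected) at any moment:
print(f"{1 - utilization:.0%} of vehicles are parked at any given time")
```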
AC Propulsion cofounder Wally Rippel converted a Volkswagen microbus into an electric vehicle while he was still a student at Caltech. Photo-illustration: Max-o-matic; photo source: Herald Examiner Collection/Los Angeles Public Library
The engineers at AC Propulsion (ACP) were familiar with the basic precepts of bidirectional EV power. The company was the brainchild of Wally Rippel and Alan Cocconi, Caltech graduates who had worked in the late 1980s and early 1990s as consultants for AeroVironment, then a developer of lightweight experimental aircraft. The pair made major contributions to the propulsion system for the Impact, a battery-powered concept car that AeroVironment built under contract for General Motors. Forerunner of the famous EV1, the Impact was regarded as the most advanced electric car of its day, thanks to its solid-state power controls, induction motor, and integrated charger. The vehicle inspired California’s ZEV mandate, instituted in 1990. As Cocconi told me, the Impact was bidirectional-capable, although that function wasn’t fully implemented.
AeroVironment had encouraged its engineers to take creative initiative in developing the Impact, but GM tightly managed efforts to translate the idiosyncratic car into a production prototype, which rankled Cocconi and Rippel. Cocconi was also dismayed by the automaker’s decision to equip the production car with an off-board rather than onboard charger, which he believed would limit the car’s utility. In 1992, he and Rippel quit the project and, with Hughes Aircraft engineer Paul Carosa, founded ACP, to further develop battery electric propulsion. The team applied their technology to a two-seat sportscar called the tzero, which debuted in January 1997.
Through the 1990s and into the early 2000s, ACP sold its integrated propulsion systems to established automakers, including Honda, Volkswagen, and Volvo, for use in production models being converted into EVs. For car companies, this was a quick and cheap way to gain experience with battery electric propulsion while also meeting any quota they may have been subject to under the California ZEV mandate.
By the turn of the millennium, however, selling EV propulsion systems had become a hard way to make a living. In early 2000, when GM announced it had ceased production of the EV1, it signaled that the automaking establishment was abandoning battery electric cars. ACP looked at other ways of marketing its technology and saw an opportunity in the California electricity crisis then unfolding.
Traditionally, the electricity business combined several discrete services, including some designed to meet demand and others designed to stabilize the network. Since the 1930s, these services had been provided by regulated, vertically integrated utilities, which operated as quasi-monopolies. The most profitable was peaking power—electricity delivered when demand was highest. The less-lucrative stabilization services balanced electricity load and generation to maintain system frequency at 60 hertz, the standard for the United States. In a vertically integrated utility, peaking services essentially subsidized stabilization services.
With deregulation in the 1990s, these aggregated services were unbundled and commodified. In California, regulators separated generation from distribution and sold 40 percent of installed capacity to newly created independent power producers that specialized in peaking power. Grid-stabilization functions were reborn as “ancillary services.” Major utilities were compelled to purchase high-cost peaking power, and because retail prices were capped, they could not pass their costs on to consumers. Moreover, deregulation disincentivized the construction of new power plants. At the turn of the millennium, nearly 20 percent of the state’s generating capacity was idled for maintenance.
General Motors’ Impact debuted at the 1990 Los Angeles Auto Show. It was regarded as the most advanced electric vehicle of its era. Photo-illustration: Max-o-matic; photo source: Alec Brooks
The newly marketized grid was highly unstable, and in 2000 and 2001, things came to a head. Hot weather caused a demand spike, and the accompanying drought (the beginning of the multidecade southwestern megadrought) cut hydropower capacity. As Californians turned on their air conditioners, peaking capacity had to be kept in operation longer. Then market speculators got into the act, sending wholesale prices up 800 percent and bankrupting Pacific Gas & Electric. Under these combined pressures, grid reliability eroded, resulting in rolling blackouts.
With the grid crippled, ACP’s Gage contacted Kempton to discuss whether bidirectional EV power could help. Kempton identified frequency regulation as the optimal V2G market because it was the most profitable of the ancillary services, constituting about 80 percent of what the California Independent System Operator, the nonprofit set up to manage the deregulated grid, then spent on such services.
The result was a demonstration project, a task organized by Alec Brooks, manager of ACP’s tzero production. Like Rippel and Cocconi, Brooks was a Caltech graduate and part of the close-knit community of EV enthusiasts that emerged around the prestigious university. After earning a Ph.D. in civil engineering in 1981, Brooks had joined AeroVironment, where he managed the development of Sunraycer, an advanced solar-powered demonstration EV built for GM, and the Impact. He recruited Rippel and Cocconi for both jobs. During the 1990s, Brooks formed a team at AeroVironment that provided support for GM’s EV programs until he too tired of the corporate routine and joined ACP in 1999.
Before cofounding AC Propulsion, Alan Cocconi worked on Sunraycer, a solar-powered car for GM. Here, he’s testing the car’s motor-drive power electronics. Photo-illustration: Max-o-matic; photo source: Alec Brooks
Working with Gage and Kempton, and consulting with the ISO, Brooks set out to understand how the EV might function as a utility resource.
ACP adapted its second-generation AC-150 drivetrain, which had bidirectional capability, for this application. As Cocconi recalled, the bidirectional function had originally been intended for a different purpose. In the 1990s, batteries had far less capacity than they do today, and for the small community of EV users, the prospect of running out of juice and becoming stranded was very real. In such an emergency, a bidirectional EV with charge to spare could come to the rescue.
With funding from the California Air Resources Board, the team installed an AC-150 drive in a Volkswagen Beetle. The system converted AC grid power to DC power to charge the battery and could also convert DC power from the battery to AC power that could feed both external stand-alone loads and the grid. Over the course of the project, the group successfully demonstrated bidirectional EV power using simulated dispatch commands from the ISO’s computerized energy-management system.
This pair of graphs shows how AC Propulsion’s AC-150 drivetrain performed in a demonstration of grid frequency regulation. The magenta line in the upper graph tracks grid frequency centered around 60 hertz. The lower graph indicates power flowing between the grid and the drivetrain; a negative value means power is being drawn from the grid, while a positive value means power is being sent back to the grid.
Photo-illustration: Max-o-matic; photo source: Alec Brooks
The experiment demonstrated the feasibility of the vehicle-to-grid approach, yet it also revealed the enormous complexities involved in deploying the technology. One unpleasant surprise, Brooks recalled, came with the realization that the electricity crisis had artificially inflated the ancillary-services market. After California resolved the crisis—basically by re-regulating and subsidizing electricity—the bubble burst, making frequency regulation as a V2G service a much less attractive business proposition.
The prospect of integrating EV storage batteries into legacy grid systems also raised concerns about control. The computers responsible for automatically signaling generators to ramp up or down to regulate frequency were programmed to control large thermoelectric and hydroelectric plants, which respond gradually to signals. Batteries, by contrast, respond nearly instantaneously to commands to draw or supply power. David Hawkins, an engineer who served as a chief aide to the ISO’s vice president of operations and advised Brooks, noted that the responsiveness of batteries had unintended consequences when they were used to regulate frequency. In one experiment involving a large lithium-ion battery, the control computer fully charged or discharged the unit in a matter of minutes, leaving no spare capacity to regulate the grid.
In principle, this problem might have been solved with software to govern the charging and discharging. The main barrier to V2G in the early 2000s, it turns out, was that the battery EV would have to be massively scaled up before it could serve as a practical energy-storage resource. And the auto industry had just canceled the battery EV. In its place, automakers promised the fuel-cell electric car, a type of propulsion system that does not easily lend itself to bidirectional power flow.
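A minimal sketch of what such a governing layer might look like, assuming simple proportional “droop” control with state-of-charge tapering; this is illustrative, not ACP’s actual control code:

```python
# Droop-style frequency regulation with state-of-charge (SOC) tapering,
# so the battery never burns through its whole capacity in minutes.
NOMINAL_HZ = 60.0
DROOP_KW_PER_HZ = 150.0   # assumed gain: kW of response per Hz of error
P_MAX_KW = 15.0           # assumed charger/inverter power limit

def regulation_power_kw(freq_hz: float, soc: float) -> float:
    """Positive = inject power into the grid; negative = draw (charge)."""
    # Frequency below 60 Hz means a generation deficit: inject power.
    p = -DROOP_KW_PER_HZ * (freq_hz - NOMINAL_HZ)
    p = max(-P_MAX_KW, min(P_MAX_KW, p))
    # Taper near the edges of a 10-90% usable SOC window to keep
    # regulation headroom in both directions.
    if p > 0:   # discharging: taper as SOC approaches the floor
        p *= max(0.0, min(1.0, (soc - 0.10) / 0.10))
    else:       # charging: taper as SOC approaches the ceiling
        p *= max(0.0, min(1.0, (0.90 - soc) / 0.10))
    return p

print(regulation_power_kw(59.95, soc=0.50))  # under-frequency: inject 7.5 kW
print(regulation_power_kw(60.05, soc=0.88))  # nearly full: charge only 1.5 kW
```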
The dramatic revival of the battery EV in the late 2000s and early 2010s led by Tesla Motors and Nissan revived prospects for the EV as a power-grid resource. This EV renaissance spawned a host of R&D efforts in bidirectional EV power, including ECOtality and the Mid-Atlantic Grid Interactive Cars Consortium. The consortium, organized by Kempton in conjunction with PJM, the regional transmission organization responsible for much of the eastern United States, used a car equipped with an AC-150 drivetrain to further study the use of V2G in the frequency-regulation market.
Over time, however, the research focus in bidirectional EV applications shifted from the grid to homes and commercial buildings. In the wake of the Fukushima nuclear disaster in 2011, for instance, Nissan developed and marketed a vehicle-to-building (V2B) charging system that enabled its Leaf EV to provide backup power.
In 2001, AC Propulsion engineers installed an AC-150 drivetrain in a Volkswagen Beetle to demonstrate the feasibility of V2G technology for regulating frequency on the power grid. Photo-illustration: Max-o-matic; photo source: Alec Brooks
The automaker later entered an R&D partnership with Fermata Energy, a Virginia-based company that develops bidirectional EV power systems. Founded by the entrepreneur and University of Virginia researcher David Slutzky in 2010, Fermata considered and then ruled out the frequency-regulation market, on the grounds that it was too small and unscalable.
Slutzky now believes that early markets for bidirectional EV power will emerge in supplying backup power and supplementing peak loads for individual commercial buildings. Those applications will require institutional fleets of EVs. Slutzky and other proponents of EV power have been pressing for a more favorable regulatory environment, including access to the subsidies that states such as California offer to users of stationary storage batteries.
Advocates believe that V2G can help pay for EV batteries. While interest in this idea seems likely to grow as EVs proliferate, the prospect of electric car owners becoming power entrepreneurs appears more distant. Hawkins, the engineer who advised Brooks, holds that the main barriers to V2G are not so much technological as economic: Viable markets need to emerge. The everyday participant in V2G, he argues, would face the difficult task of attempting to arbitrage the difference between wholesale and retail prices while still paying the retail rate. In principle, EV owners could take advantage of the same feed-in tariffs and net-metering schemes designed to enable homeowners to sell surplus solar power back to the grid. But marketing rooftop solar power has proven more complicated and costly for suburbanites than initially assumed, and the same would likely hold true for EV power.
Another major challenge is how to balance the useful lifetime of EV batteries in transportation and non-vehicle applications. That question turns on understanding how EV batteries will perform and age in stationary-power roles. Users would hardly be further ahead, after all, if they substantially degraded their batteries in the act of paying them off. Grid managers could also face problems if they come to depend on EV batteries that prove unreliable or become unavailable as driving patterns change.
In short, the core conundrum of V2G is the conflict of interest that comes from repurposing privately owned automobiles as power plants. Scaling up this technology will require intimate collaboration between automaking and electricity-making, enterprises with substantially different revenue models and systems of regulation. At the moment, the auto industry does not have a clear interest in V2G.
On the other hand, rising electricity demand, concerns about fossil fuels, greenhouse gases, and climate change, and the challenges of managing intermittent renewable energy have all created new justifications for bidirectional EV power. With the proliferation of EVs over the last decade, more demonstrations of the technology are being staged for a host of applications—sometimes expressed as V2X, or vehicle-to-everything. Some automakers, notably Nissan and now Ford, already sell bidirectional EVs, and others are experimenting with the technology. Enterprises are emerging to equip and manage demonstrations of V2B, V2G, and V2X for utilities and big institutional users of electricity. Some ambitious pilot projects are underway, notably in the Dutch city of Utrecht.
Back in 2002, at the end of their experiment, the engineers at AC Propulsion concluded that what V2G really needed was a powerful institutional champion. They went on to make further important contributions to EV technology. Brooks and Rippel worked for the nascent Tesla Motors, while Cocconi continued at ACP until a cancer diagnosis led him to reevaluate his life. In the mid-2000s, Cocconi sold his stake in the company and devoted himself to aviation, his first love, developing remote-controlled solar-powered aircraft. The rebirth of the battery electric car in the 2010s and 2020s reaffirmed the efforts of these three visionary pioneers.
A strong V2G patron has yet to emerge. Nevertheless, the idea of an off-the-shelf energy storage unit that also provides transportation and pays for itself is likely to remain attractive enough to sustain ongoing interest. Who knows? The electric car might still one day become a power plant on wheels.
The author thanks Alec Brooks, Alan Cocconi, David Hawkins, David Slutzky, and Wally Rippel for sharing their experiences. Parts of this article are adapted from the author’s new book, Age of Auto Electric (MIT Press, 2022).
The 19-seater Dornier 228 propeller plane that took off into the cold blue January sky looked ordinary at first glance. Spinning its left propeller, however, was a 2-megawatt electric motor powered by two hydrogen fuel cells—the right side ran on a standard kerosene engine—making it the largest aircraft flown on hydrogen to date. Val Miftakhov, founder and CEO of ZeroAvia, the California startup behind the 10-minute test flight in Gloucestershire, England, called it a “historical day for sustainable aviation.”
Los Angeles–based Universal Hydrogen plans to test a 50-seat hydrogen-powered aircraft by the end of February. Both companies promise commercial flights of retrofitted turboprop aircraft by 2025. French aviation giant Airbus is going bigger with a planned 2026 demonstration flight of its iconic A380 passenger airplane, which will fly using hydrogen fuel cells and by burning hydrogen directly in an engine. And Rolls-Royce is making headway on aircraft engines that burn pure hydrogen.
The aviation industry, responsible for some 2.5 percent of global carbon emissions, has committed to net-zero emissions by 2050. Getting there will require several routes, including sustainable fuels, hybrid-electric engines, and battery-electric aircraft.
Hydrogen is another potential route. Whether used to make electricity in fuel cells or burned in an engine, it combines with oxygen to emit water vapor. If green hydrogen scales up for trucks and ships, it could be a low-cost fuel without the environmental issues of batteries.
Flying on hydrogen brings storage and aircraft-certification challenges, but aviation companies are doing the groundwork now for hydrogen flight by 2035. “Hydrogen is headed off to the sky, and we’re going to take it there,” says Amanda Simpson, vice president for research and technology at Airbus Americas.
The most plentiful element, hydrogen is also the lightest—key for an industry fighting gravity—packing three times the energy of jet fuel by weight. The problem with hydrogen is its volume. For transport, it has to be stored in heavy tanks either as a compressed high-pressure gas or a cryogenic liquid.
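The trade-off in numbers, using standard approximate heating values and densities rather than figures from the article:

```python
# Gravimetric vs. volumetric energy density: hydrogen vs. jet fuel.
# Approximate lower heating values and liquid densities.
h2_mj_per_kg, jet_mj_per_kg = 120.0, 43.0   # specific energy
lh2_kg_per_l, jet_kg_per_l = 0.071, 0.80    # liquid H2 vs. kerosene

print(f"By weight: {h2_mj_per_kg / jet_mj_per_kg:.1f}x jet fuel")  # ~2.8x
print(f"By volume: liquid H2 ~{h2_mj_per_kg * lh2_kg_per_l:.1f} MJ/L "
      f"vs. ~{jet_mj_per_kg * jet_kg_per_l:.1f} MJ/L for jet fuel")
```

Liquid hydrogen carries roughly a quarter of the energy of kerosene per liter, which is why tankage, not fuel weight, dominates the design problem.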
ZeroAvia is using compressed hydrogen gas, since it is already approved for road transport. Its test airplane had two hydrogen fuel cells and tanks sitting inside the cabin, but the team is now thinking creatively about a compact system with minimal changes to aircraft design to speed up certification in the United States and Europe. The fuel cells’ added weight could reduce flying range, but “that’s not a problem, because aircraft are designed to fly much further than they’re used,” says vice president of strategy James McMicking.
The company has backing from investors that include Bill Gates and Jeff Bezos; partnerships with British Airways and United Airlines; and 1,500 preorders for its hydrogen-electric power-train system, half of which are for smaller, 400-kilometer-range 9- to 19-seaters.
By 2027, ZeroAvia plans to convert larger, 70-seater turboprop aircraft with twice the range, used widely in Europe. The company is developing 5-MW electric motors for those, and it plans to switch to more energy-dense liquid hydrogen to save space and weight. The fuel is novel for the aviation industry and could require a longer regulatory approval process, McMicking says.
Next will come a 10-MW power train for aircraft with 100 to 150 seats, “the workhorses of the industry,” he says. Those planes—think Boeing 737—are responsible for 60 percent of aviation emissions. Making a dent in those with hydrogen will require much more efficient fuel cells. ZeroAvia is working on proprietary high-temperature fuel cells for that, McMicking says, with the ability to reuse the large amounts of waste heat generated. “We have designs and a technology road map that takes us into jet-engine territory for power,” he says.
Universal Hydrogen, which counts Airbus, GE Aviation, and American Airlines among its strategic investors, is placing bets on liquid hydrogen. The startup, “a hydrogen supply and logistics company at our core,” wants to ensure a seamless delivery network for hydrogen aviation as it catches speed, says founder and CEO Paul Eremenko. The company sources green hydrogen, turns it into liquid, and puts it in relatively low-tech insulated aluminum tanks that it will deliver via road, rail, or ship. “We want them certified by the Federal Aviation Administration for 2025, which means they can’t be a science project,” he says.
The cost of green hydrogen is expected to be on par with kerosene by 2025, Eremenko says. But “there’s nobody out there with an incredible hydrogen-airplane solution. It’s a chicken-and-egg problem.”
To crack it, Universal Hydrogen partnered with leading fuel-cell maker Plug Power to develop a few thousand conversion kits for regional turboprop airplanes. The kits swap the engine in its streamlined housing (known as a nacelle) for a fuel-cell stack, power electronics, and a 2-MW electric motor. While the company’s competitors use batteries as buffers during takeoff, Eremenko says Universal uses smart algorithms to manage fuel cells, so they can ramp up and respond quickly. “We are the Nespresso of hydrogen,” he says. “We buy other people’s coffee, put it into capsules, and deliver to customers. But we have to build the first coffee machine. We’re the only company incubating the chicken and egg at the same time.”
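To illustrate the contrast Eremenko is drawing, here is a toy power-split controller. It assumes nothing about Universal Hydrogen’s actual algorithms; it simply shows why a fuel cell limited to a fixed ramp rate needs a battery buffer during a takeoff power step:

```python
# Toy illustration (not Universal Hydrogen's algorithm): a ramp-limited
# fuel cell meets a takeoff power step; a battery covers the shortfall.
def split_power(demand_kw, fc_output_kw, fc_ramp_kw_per_s, dt_s=1.0):
    """Limit the fuel cell to its ramp rate; the battery supplies the rest."""
    step = max(-fc_ramp_kw_per_s * dt_s,
               min(demand_kw - fc_output_kw, fc_ramp_kw_per_s * dt_s))
    fc_output_kw += step
    battery_kw = demand_kw - fc_output_kw  # positive = battery discharging
    return fc_output_kw, battery_kw

# Hypothetical takeoff profile: demand jumps from 400 kW to 2,000 kW.
fc = 400.0
for t, demand in enumerate([400, 2000, 2000, 2000, 2000, 2000]):
    fc, batt = split_power(demand, fc, fc_ramp_kw_per_s=300.0)
    print(f"t={t}s demand={demand} fuel_cell={fc:.0f} battery={batt:.0f} (kW)")
```

A fuel cell that can ramp faster, as Universal claims its management achieves, shrinks or eliminates the battery’s share of that transient.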
This rendering of an Airbus A380 demonstrator (its flight is presently slated for 2026) shows current designs for an aircraft that’s expected to fly using fuel cells and by burning hydrogen directly in an engine. Airbus
Fuel cells have a few advantages over a large central engine. They allow manufacturers to spread out smaller propulsion motors over an aircraft, giving them more design freedom. And because there are no high-temperature moving parts, maintenance costs can be lower. For long-haul aircraft, however, the weight and complexity of high-power fuel cells make hydrogen-combustion engines appealing.
Airbus is considering both fuel-cell and combustion propulsion for its ZEROe hydrogen aircraft program. It has partnered with German automotive fuel-cell maker ElringKlinger and, for direct-combustion engines, with CFM International, a joint venture between GE Aviation and Safran. Burning liquid hydrogen in today’s engines is expected to require only slight modifications, such as a shorter combustion chamber and better seals.
Airbus is also evaluating hybrid propulsion concepts with a hydrogen-engine-powered turbine and a hydrogen-fuel-cell-powered motor on the same shaft, says Simpson, of Airbus Americas. “Then you can optimize it so you use both propulsion systems for takeoff and climb, and then turn one off for cruising.”
The company isn’t limiting itself to a simple aircraft redesign. Hydrogen could be stored in a cupola on top of the plane, in pods under the wings, or in a large tank at the back, Simpson says. Without liquid fuel in the wings, as in traditional airplanes, she says, “you can optimize wings for aerodynamics, make them thinner or longer. Or maybe a blended-wing body, which could be very different. This opens up the opportunity to optimize aircraft for efficiency.” Certification for such new aircraft could take years, and Airbus isn’t expecting commercial flights until 2035.
Conventional aircraft made today will be around in 2050 given their 25- to 30-year life-span, says Robin Riedel, an analyst at McKinsey & Co. Sustainable fuels are the only green option for those. He says hydrogen could play a role there, through “power-to-liquid technology, where you can mix hydrogen and captured carbon dioxide to make aviation fuel.”
Even then, Riedel thinks hydrogen will likely be a small part of aviation’s sustainability solution until 2050. “By 2070, hydrogen is going to play a much bigger role,” he says. “But we have to get started on hydrogen now.” The money that Airbus and Boeing are putting into hydrogen is a small fraction of aerospace, he says, but big airlines investing in hydrogen companies or placing power-train orders “shows there is desire.”
The aviation industry has to clean up if it is to grow, Simpson says. Biofuels are only a stepping-stone, because they reduce carbon emissions but not aviation’s other harmful emissions. “If we’re going to move towards clean aviation, we have to rethink everything from scratch, and that’s what ZEROe is doing,” she says. “This is an opportunity to make not an evolutionary change but a truly revolutionary one.”
This sponsored article is brought to you by COMSOL.
History teaches that the Industrial Revolution began in England in the mid-18th century. While that era of sooty foundries and mills is long past, manufacturing remains essential — and challenging. One promising way to meet modern industrial challenges is by using additive manufacturing (AM) processes, such as powder bed fusion and other emerging techniques. To fulfill its promise of rapid, precise, and customizable production, AM demands more than just a retooling of factory equipment; it also calls for new approaches to factory operation and management.
That is why Britain’s Manufacturing Technology Centre (MTC) has enhanced its in-house metal powder bed fusion AM facility with a simulation model and app to help factory staff make informed decisions about its operation. The app, built using the Application Builder in the COMSOL Multiphysics software, shows the potential for pairing a full-scale AM factory with a so-called “digital twin” of itself.
“The model helps predict how heat and humidity inside a powder bed fusion factory may affect product quality and worker safety,” says Adam Holloway, a technology manager within the MTC’s modeling team. “When combined with data feeds from our facility, the app helps us integrate predictive modeling into day-to-day decision-making.” The MTC project demonstrates the benefits of placing simulation directly into the hands of today’s industrial workforce and shows how simulation could help shape the future of manufacturing.
“We’re trying to present the findings of some very complex calculations in a simple-to-understand way. By creating an app from our model, we can empower staff to run predictive simulations on laptops during their daily shifts.”
—Adam Holloway, MTC Technology Manager
To help modern British factories keep pace with the world, the MTC promotes high-value manufacturing throughout the United Kingdom. The MTC is based in the historic English industrial city of Coventry (Figure 2), but its focus is solely on the future. That is why the team has committed significant human and technical resources to its National Centre for Additive Manufacturing (NCAM).
“Adopting AM is not just about installing new equipment. Our clients are also seeking help with implementing the digital infrastructure that supports AM factory operations,” says Holloway. “Along with enterprise software and data connectivity, we’re exploring how to embed simulation within their systems as well.”
The NCAM’s Digital Reconfigurable Additive Manufacturing for Aerospace (DRAMA) project provides a valuable venue for this exploration. Developed in concert with numerous manufacturers, the DRAMA initiative includes the new powder bed fusion AM facility mentioned previously. With that mini factory as DRAMA’s stage, Holloway and his fellow simulation specialists play important roles in making its production of AM aerospace components a success.
What makes a manufacturing process “additive”, and why are so many industries exploring AM methods? In the broadest sense, an additive process is one where objects are created by adding material layer by layer, rather than removing it or molding it. A reductive or subtractive process for producing a part may, for example, begin with a solid block of metal that is then cut, drilled, and ground into shape. An additive method for making the same part, by contrast, begins with empty space! Loose or soft material is then added to that space (under carefully controlled conditions) until it forms the desired shape. That pliable material must then be solidified into a durable finished part.
Different materials demand different methods for generating and solidifying additive forms. For example, common 3D printers sold to consumers produce objects by unspooling warm plastic filament, which bonds to itself and becomes harder as it cools. By contrast, the metal powder bed fusion process (Ref. 1) begins with, as its name suggests, a powdered metal which is then melted by applied heat and re-solidified when it cools. A part produced via the metal powder bed fusion process can be seen in Figure 3.
“The market opportunities for AM methods have been understood for a long time, but there have been many obstacles to large-scale adoption,” Holloway says. “Some of these obstacles can be overcome during the design phase of products and AM facilities. Other issues, such as the impact of environmental conditions on AM production, must be addressed while the facility is operating.”
For instance, maintaining careful control of heat and humidity is an essential task for the DRAMA team. “The metal powder used for the powder bed fusion process (Figure 4) is highly sensitive to external conditions,” says Holloway. “This means it can begin to oxidize and pick up ambient moisture even while it sits in storage, and those processes will continue as it moves through the facility. Exposure to heat and moisture will change how it flows, how it melts, how it picks up an electric charge, and how it solidifies,” he says. “All of these factors can affect the resulting quality of the parts you’re producing.”
Careless handling of powdered metal is not just a threat to product quality. It can threaten the health and safety of workers as well. “The metal powder used for AM processes is flammable and toxic, and as it dries out, it becomes even more flammable,” Holloway says. “We need to continuously measure and manage humidity levels, as well as how loose powder propagates throughout the facility.”
To maintain proper atmospheric conditions, a manufacturer could augment its factory’s ventilation with a full climate control system, but that could be prohibitively expensive. The NCAM estimated that it would cost nearly half a million pounds to add climate control to its relatively modest facility. But what if it could adequately manage heat and humidity without adding such a complicated system?
Perhaps using multiphysics simulation for careful process management could provide a cost-effective alternative. “As part of the DRAMA program, we created a model of our facility using the computational fluid dynamics (CFD) capabilities of the COMSOL software. Our model (Figure 5) uses the finite element method to solve partial differential equations describing heat transfer and fluid flow across the air domain in our facility,” says Holloway. “This enabled us to study how environmental conditions would be affected by multiple variables, from the weather outside, to the number of machines operating, to the way machines were positioned inside the shop. A model that accounts for those variables helps factory staff adjust ventilation and production schedules to optimize conditions,” he explains.
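The MTC’s model is a full finite-element CFD model built in COMSOL. As a deliberately simplified stand-in for the underlying idea, the sketch below forecasts shop-floor temperature with an explicit finite-difference heat equation, treating AM machines as heat sources and an open door as a fixed-temperature boundary; the grid, coefficients, and machine positions are all invented for illustration:

```python
import numpy as np

# Simplified stand-in for the MTC's COMSOL model: explicit finite-difference
# heat diffusion on a 2D floor plan (periodic edges for simplicity).
nx = ny = 50
T = np.full((nx, ny), 20.0)        # deg C, initial shop temperature
alpha, dx, dt = 1e-4, 0.5, 100.0   # diffusivity (m^2/s), grid (m), step (s)
machines = [(10, 10), (10, 40), (40, 25)]  # grid cells with machines running

for step in range(500):
    lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
           np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4 * T) / dx**2
    T += alpha * dt * lap          # diffuse heat across the air domain
    for i, j in machines:
        T[i, j] += 0.05            # each running machine adds heat
    T[:, 0] = 20.0                 # open door held at outdoor temperature

print(f"Hottest point on the floor: {T.max():.1f} deg C")
```

Even this toy version shows the levers the real model exposes: add or reposition machines, open or close a door, and re-run the forecast.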
The DRAMA team made their model more accessible by building a simulation app of it with the Application Builder in COMSOL Multiphysics (Figure 6). “We’re trying to present the findings of some very complex calculations in a simple-to-understand way,” Holloway explains. “By creating an app from our model, we can empower staff to run predictive simulations on laptops during their daily shifts.”
The app user can define relevant boundary conditions for the beginning of a factory shift and then make ongoing adjustments. Over the course of a shift, heat and humidity levels will inevitably fluctuate. Perhaps factory staff should alter the production schedule to maintain part quality, or maybe they just need to open doors and windows to improve ventilation. Users can change settings in the app to test the possible effects of actions like these. For example, Figure 8 presents isothermal surface plots that show the effect that opening the AM machines’ build chambers has on air temperature, while Figure 9 shows how airflow is affected by opening the facility doors.
While the current app is an important step forward, it still requires workers to manually input relevant data. Looking ahead, the DRAMA team envisions something more integrated, and therefore more powerful: a “digital twin” of its AM facility. A digital twin, as described by Ed Fontes in a 2019 post on the COMSOL Blog (Ref. 2), is “a dynamic, continuously updated representation of a real physical product, device, or process.” It is important to note that even the most detailed model of a system is not necessarily its digital twin.
“To make our factory environment model a digital twin, we’d first provide it with ongoing live data from the actual factory,” Holloway explains. “Once our factory model was running in the background, it could adjust its forecasts in response to its data feeds and suggest specific actions based on those forecasts.”
“We want to integrate our predictive model into a feedback loop that includes the actual factory and its staff. The goal is to have a holistic system that responds to current factory conditions, uses simulation to make predictions about future conditions, and seamlessly makes self-optimizing adjustments based on those predictions,” Holloway says. “Then we could truly say we’ve built a digital twin for our factory.”
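In rough pseudocode terms, such a loop might look like the sketch below. Every function name here is a hypothetical placeholder; the MTC has not published this interface:

```python
# Hedged sketch of the sense-forecast-act loop Holloway describes.
# All function names are hypothetical placeholders, not MTC's API.
import time

def read_sensors():
    """Placeholder: pull live temperature/humidity feeds from the floor."""
    return {"temperature_c": 23.5, "humidity": 0.42}

def run_forecast(state):
    """Placeholder: hand current conditions to the simulation model."""
    return {"humidity_in_2h": state["humidity"] + 0.05}

def suggest_action(forecast):
    """Turn a forecast into an operator-facing recommendation."""
    if forecast["humidity_in_2h"] > 0.45:
        return "Increase ventilation before starting the next powder batch."
    return "Conditions nominal; no action needed."

for _ in range(3):   # in production this loop would run continuously
    print(suggest_action(run_forecast(read_sensors())))
    time.sleep(1)    # e.g., re-forecast every few minutes on the real floor
```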
As an intermediate step toward building a full factory-level digital twin, the DRAMA simulation app has already proven its worth. “Our manufacturing partners may already see how modeling can help with planning an AM facility, but not really understand how it can help with operation,” Holloway says. “We’re showing the value of enabling a line worker to open up the app, enter in a few readings or import sensor data, and then quickly get a meaningful forecast of how a batch of powder will behave that day.”
Beyond its practical insights for manufacturers, the overall project may offer a broader lesson as well: By pairing its production line with a dynamic simulation model, the DRAMA project has made the entire operation safer, more productive, and more efficient. The DRAMA team has achieved this by deploying the model where it can do the most good — into the hands of the people working on the factory floor.
At Moffett Field in Mountain View, Calif., Lighter Than Air (LTA) Research is floating a new approach to a technology that saw its rise and fall a century ago: airships. Although airships have long since been supplanted by planes, LTA, founded in 2015 by CEO Alan Weston, believes that a combination of new materials, better construction techniques, and technological advancements means airships are poised, if not to reclaim the skies, then to find a new niche.
Although airships never died off entirely—the Goodyear blimps, familiar to sports fans, are proof of that—the industry was already in decline by 1937, the year of the Hindenburg disaster. By the end of World War II, airships couldn’t compete with the speed airplanes offered, and they required larger crews. The airships that still linger today serve primarily for advertising and sightseeing.
LTA’s Pathfinder 1 carries bigger dreams than hovering over a sports stadium, however. The company sees a natural fit for airships in humanitarian and relief missions: according to Carl Taussig, LTA’s chief technical officer, airships can stay aloft for long periods when conditions on the ground aren’t ideal, have a long range, and can carry significant payloads.
Pathfinder’s cigar-shaped envelope is just over 120 meters in length and 20 meters in diameter. While that dwarfs Goodyear’s current, 75-meter Wingfoot One, it’s still only half the length of the Hindenburg. LTA expects Pathfinder 1 to carry approximately 4 tonnes of cargo, in addition to its crew, water ballast, and fuel. The airship will have a top speed of 65 knots, or about 120 kilometers per hour—on par with the Hindenburg—with a sustained cruise speed of 35 to 40 knots (65 to 75 km/h).
It may not seem like much of an advance to build an airship that flies no faster than the Hindenburg. But Pathfinder 1 carries a lot of new tech that LTA is betting will prove key to an airship resurgence.
For one, airships used to be constructed around riveted aluminum girders, which provided the highest strength-to-weight ratio available at the time. Instead, LTA will be using carbon-fiber tubes attached to titanium hubs. As a result, Pathfinder 1’s primary structure will be both stronger and lighter.
Pathfinder 1’s outer covering is also a step up from past generations. Airships like the 1930s’ Graf Zeppelin had coverings made out of doped cotton canvas. The dope painted on the fabric increased its strength and resiliency. But canvas is still canvas. LTA has instead built its outer coverings out of a three-layer laminate of synthetics. The outermost layer is DuPont’s Tedlar, which is a polyvinyl fluoride. The middle layer is a loose weave of fire-retardant aramid fibers. The inner layer is polyester. “It’s very similar to what’s used in a lot of racing sailboats,” says Taussig. “We needed to modify that material to make it fire resistant and change a little bit about its structural performance.”
But neither the materials science nor the manufacturing advances will take primary credit for LTA’s looked-for success, according to Taussig—instead, it’s the introduction of electronics. “Everything’s electric on Pathfinder,” he says. “All the actuation, all the propulsion, all the actual power is all electrically generated. It’s a fully electric fly-by-wire aircraft, which is not something that was possible 80 years ago.” Pathfinder 1 has 12 electric motors for propulsion, as well as four tail fins with steering rudders controlled by its fly-by-wire system. (During initial test flights, the airship will be powered by two reciprocating aircraft engines).
There’s one other piece of equipment making an appearance on Pathfinder 1 that wasn’t available 80 years ago: lidar. Installed at the top of each of Pathfinder 1’s helium gas cells is an automotive-grade lidar. “The lidar can give us a point cloud showing the entire internal hull of that gas cell,” says Taussig, which can then be used to determine the gas cell’s volume accurately. In flight, the airship’s pilots can use that information, as well as data about the helium’s purity, pressure, and temperature, to better keep the craft pitched properly and to avoid extra stress on the internal structure during flight.
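For a roughly convex shape, one common way to turn such a point cloud into a volume estimate is a convex hull. The sketch below uses synthetic points standing in for lidar returns; it illustrates the idea only and is not LTA’s processing pipeline:

```python
import numpy as np
from scipy.spatial import ConvexHull

# Synthetic stand-in for lidar returns: points on the surface of an
# ellipsoidal "gas cell" roughly 20 m x 20 m x 12 m (invented dimensions).
rng = np.random.default_rng(42)
points = rng.normal(size=(5000, 3))
points /= np.linalg.norm(points, axis=1, keepdims=True)  # unit sphere
points *= np.array([10.0, 10.0, 6.0])                    # semi-axes, meters

hull = ConvexHull(points)
print(f"Estimated gas-cell volume: {hull.volume:.0f} m^3")
# Analytic check: (4/3)*pi*10*10*6 is about 2513 m^3; the hull slightly
# underestimates because its flat facets inscribe the smooth surface.
```

Combined with helium purity, pressure, and temperature, a volume estimate like this tells the pilots how much lift each cell is actually providing.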
Although LTA’s initial focus is on humanitarian applications, there are other areas where airships might shine one day. “An airship is kind of a ‘tweener,’ in between sea cargo and air freight,” says Taussig. Being fully electric, Pathfinder 1 is also greener than traditional air- or sea-freight options.
After completing Pathfinder 1’s construction late in 2022, LTA plans to conduct a series of ground tests on each of the airship’s systems in the first part of 2023. Once the team is satisfied with those tests, they’ll move to tethered flight tests and finally untethered flight tests over San Francisco’s South Bay later in the year.
The company will also construct an approximately 180-meter-long airship, Pathfinder 3, at its Akron Airdock facility in Ohio. Pathfinder 3 won’t be ready to fly in 2023, but its development shows LTA’s aspirations for an airship renaissance are more than just hot air.
This article appears in the January 2023 print issue as “The Return of the Airship.”
Top Tech 2023: A Special Report
Preview exciting technical developments for the coming year.
Can This Company Dominate Green Hydrogen?
Fortescue will need more electricity-generating capacity than France.
Pathfinder 1 could herald a new era for zeppelins
A New Way to Speed Up Computing
Blue microLEDs bring optical fiber to the processor.
The Personal-Use eVTOL Is (Almost) Here
Opener’s BlackFly is a pulp-fiction fever dream with wings.
Baidu Will Make an Autonomous EV
Its partnership with Geely aims at full self-driving mode.
China Builds New Breeder Reactors
The power plants could also make weapons-grade plutonium.
Economics Drives a Ray-Gun Resurgence
Lasers should be cheap enough to use against drones.
A Cryptocurrency for the Masses or a Universal ID?
What Worldcoin’s killer app will be is not yet clear.
The company’s Condor chip will boast more than 1,000 qubits.
Vagus-nerve stimulation promises to help treat autoimmune disorders.
New satellites can connect directly to your phone.
The E.U.’s first exascale supercomputer will be built in Germany.
A dozen more tech milestones to watch for in 2023.
A rocket built by Indian startup Skyroot has become the country’s first privately developed launch vehicle to reach space, following a successful maiden flight earlier today. The suborbital mission is a major milestone for India’s private space industry, say experts, though more needs to be done to nurture the fledgling sector.
The Vikram-S rocket, named after the founder of the Indian space program, Vikram Sarabhai, lifted off from the Indian Space Research Organization’s (ISRO) Satish Dhawan Space Centre, on India’s east coast, at 11:30 a.m. local time (1 a.m. eastern time). It reached a peak altitude of 89.5 kilometers (55.6 miles), crossing the 80-km line that NASA counts as the boundary of space, but falling just short of the 100 km recognized by the Fédération Aéronautique Internationale.
In the longer run, India’s space industry has ambitions of capturing a significant chunk of the global launch market.
Pawan Kumar Chandana, cofounder of the Hyderabad-based startup, says the success of the launch is a major victory for India’s nascent space industry, but the buildup to the mission was nerve-racking. “We were pretty confident on the vehicle, but, as you know, rockets are very notorious for failure,” he says. “Especially in the last 10 seconds of countdown, the heartbeat was racing up. But once the vehicle had crossed the launcher and then went into the stable trajectory, I think that was the moment of celebration.”
At just 6 meters (20 feet) long and weighing only around 550 kilograms (0.6 tonnes), the Vikram-S is not designed for commercial use. Today’s mission, called Prarambh, which means “the beginning” in Sanskrit, was designed to test key technologies that will be used to build the startup’s first orbital rocket, the Vikram I. That rocket will reportedly be capable of lofting as much as 480 kg to a 500-km altitude and is slated for a maiden launch next October.
Skyroot cofounder Pawan Kumar Chandana standing in front of the Vikram-S rocket at the Satish Dhawan Space Centre, on the east coast of India. Skyroot
In particular, the mission has validated Skyroot’s decision to go with a novel all-carbon fiber structure to cut down on weight, says Chandana. It also allowed the company to test 3D-printed thrusters, which were used for spin stabilization in Vikram-S but will power the upper stages of its later rockets. Perhaps the most valuable lesson, though, says Chandana, was the complexity of interfacing Skyroot's vehicle with ISRO’s launch infrastructure. “You can manufacture the rocket, but launching it is a different ball game,” he says. “That was a great learning experience for us and will really help us accelerate our orbital vehicle.”
Skyroot is one of several Indian space startups looking to capitalize on recent efforts by the Indian government to liberalize its highly regulated space sector. Due to the dual-use nature of space technology, ISRO has historically had a government-sanctioned monopoly on most space activities, says Rajeswari Pillai Rajagopalan, director of the Centre for Security, Strategy and Technology at the Observer Research Foundation think tank, in New Delhi. While major Indian engineering players like Larsen & Toubro and Godrej Aerospace have long supplied ISRO with components and even entire space systems, the relationship has been one of customer and vendor, she says.
But in 2020, Finance Minister Nirmala Sitharaman announced a series of reforms to allow private players to build satellites and launch vehicles, carry out launches, and provide space-based services. The government also created the Indian National Space Promotion and Authorisation Centre (InSpace), a new agency designed to act as a link between ISRO and the private sector, and affirmed that private companies would be able to take advantage of ISRO’s facilities.
The first launch of a private rocket from an ISRO spaceport is a major milestone for the Indian space industry, says Rajagopalan. “This step itself is pretty crucial, and it’s encouraging to other companies who are looking at this with a lot of enthusiasm and excitement,” she says. But more needs to be done to realize the government’s promised reforms, she adds. The Space Activities Bill that is designed to enshrine the country’s space policy in legislation has been languishing in draft form for years, and without regulatory clarity, it’s hard for the private sector to justify significant investments. “These are big, bold statements, but these need to be translated into actual policy and regulatory mechanisms,” says Rajagopalan.
Skyroot’s launch undoubtedly signals the growing maturity of India’s space industry, says Saurabh Kapil, associate director in PwC’s space practice. “It’s a critical message to the Indian space ecosystem, that we can do it, we have the necessary skill set, we have those engineering capabilities, we have those manufacturing or industrialization capabilities,” he says.
The Vikram-S rocket blasting off from the Satish Dhawan Space Centre, on the east coast of India. Skyroot
However, crossing this technical milestone is only part of the challenge, he says. The industry also needs to demonstrate a clear market for the kind of launch vehicles that companies like Skyroot are building. While private players are showing interest in launching small satellites for applications like agriculture and infrastructure monitoring, he says, these companies will be able to build sustainable businesses only if they are allowed to compete for more lucrative government and defense-sector contracts.
In the longer run, though, India’s space industry has ambitions of capturing a significant chunk of the global launch market, says Kapil. ISRO has already developed a reputation for both reliability and low cost—its 2014 mission to Mars cost just US $74 million, one-ninth the cost of a NASA Mars mission launched the same week. That is likely to translate to India’s private space industry, too, thanks to a considerably lower cost of skilled labor, land, and materials compared with those of other spacefaring nations, says Kapil. “The optimism is definitely there that because we are low on cost and high on reliability, whoever wants to build and launch small satellites is largely going to come to India,” he says.
Collisions with birds are a serious problem for commercial aircraft, costing the industry billions of dollars and killing thousands of animals every year. New research shows that a robotic imitation of a peregrine falcon could be an effective way to keep them out of flight paths.
Worldwide, so-called birdstrikes are estimated to cost the civil aviation industry almost US $1.4 billion annually. Nearby habitats are often deliberately made unattractive to birds, but airports also rely on a variety of deterrents designed to scare them away, such as loud pyrotechnics or speakers that play distress calls from common species.
However, the effectiveness of these approaches tends to decrease over time, as the birds get desensitized by repeated exposure, says Charlotte Hemelrijk, a professor on the faculty of science and engineering at the University of Groningen, in the Netherlands. Live hawks or blinding lasers are also sometimes used to disperse flocks, she says, but this is controversial as it can harm the animals, and keeping and training falcons is not cheap.
“The birds don’t distinguish [RobotFalcon] from a real falcon, it seems.”
—Charlotte Hemelrijk, University of Groningen
In an effort to find a more practical and lasting solution, Hemelrijk and colleagues designed a robotic peregrine falcon that can be used to chase flocks away from airports. The device is the same size and shape as a real hawk, and its fiberglass and carbon-fiber body has been painted to mimic the markings of its real-life counterpart.
Rather than flapping like a bird, the RobotFalcon relies on two small battery-powered propellers on its wings, which allow it to travel at around 30 miles per hour (about 48 kilometers per hour) for up to 15 minutes at a time. A human operator controls the machine remotely, from a hawk’s-eye perspective, via a camera perched above the robot’s head.
To see how effective the RobotFalcon was at scaring away birds, the researchers tested it against a conventional quadcopter drone over three months of field testing, near the Dutch city of Workum. They also compared their results to 15 years of data collected by the Royal Netherlands Air Force that assessed the effectiveness of conventional deterrence methods such as pyrotechnics and distress calls.
In a paper published in the Journal of the Royal Society Interface, the team showed that the RobotFalcon cleared fields of birds faster and more effectively than the drone. It also kept birds away from fields longer than distress calls, the most effective of the conventional approaches.
There was no evidence of birds getting habituated to the RobotFalcon over three months of testing, says Hemelrijk, and the researchers also found that the birds exhibited behavior patterns associated with escaping from predators much more frequently with the robot than with the drone. “The way of reacting to the RobotFalcon is very similar to the real falcon,” says Hemelrijk. “The birds don’t distinguish it from a real falcon, it seems.”
Other attempts to use hawk-imitating robots to disperse birds have had less promising results, though. Morgan Drabik-Hamshare, a research wildlife biologist at the U.S. Department of Agriculture, and her colleagues published a paper in Scientific Reports last year that described how they pitted a robotic peregrine falcon with flapping wings against a quadcopter and a fixed-wing remote-controlled aircraft.
They found the robotic falcon was the least effective of the three at scaring away turkey vultures, with the quadcopter scaring the most birds off and the remote-controlled plane eliciting the quickest response. “Despite the predator silhouette, the vultures did not perceive the predator UAS [unmanned aircraft system] as a threat,” Drabik-Hamshare wrote in an email.
Zihao Wang, an associate lecturer at the University of Sydney, in Australia, who develops UAS for bird deterrence, says the RobotFalcon does seem to be effective at dispersing flocks. But he points out that its wingspan is nearly twice the diagonal length of the quadcopter it was compared with, which means it presents a much larger silhouette from the birds’ perspective. The birds could therefore be reacting more to its size than to its shape, and he would like to see the RobotFalcon compared with a similarly sized drone in the future.
The unique design also means the robot requires an experienced and specially trained operator, Wang adds, which could make it difficult to roll out widely. A potential solution could be to make the system autonomous, he says, but it’s unclear how easy this would be.
Hemelrijk says automating the RobotFalcon is probably not feasible, both due to strict regulations around the use of autonomous drones near airports as well as the sheer technical complexity. Their current operator is a falconer with significant experience in how hawks target their prey, she says, and creating an autonomous system that could recognize and target bird flocks in a similar way would be highly challenging.
But while the need for skilled operators is a limitation, Hemelrijk points out that most airports already have full-time staff dedicated to bird deterrence, who could be trained. And given the apparent lack of habituation and the ability to chase birds in a specific direction—so that they head away from runways—she thinks the robotic falcon could be a useful addition to their arsenal.
This article appears in the February 2023 print issue as “Robotic Falcon Is the Scarecrow of the Skies.”
Planning for the return journey is an integral part of the preparations for a crewed Mars mission. Astronauts will require about 50 tonnes of rocket propellant for the ascent vehicle that will lift them off the planet’s surface, including approximately 31 tonnes of oxygen. Crewed missions could carry that oxygen with them, but that is the less popular option: scientists are optimistic that oxygen could instead be produced from the carbon dioxide–rich Martian atmosphere itself, using a system called MOXIE.
The Mars Oxygen ISRU (In-Situ Resource Utilization) Experiment is an 18-kilogram unit housed within the Perseverance rover on Mars. The unit is “the size of a toaster,” says Jeffrey Hoffman, professor of aerospace engineering at MIT. Its job is to electrochemically break down carbon dioxide collected from the Martian atmosphere into oxygen and carbon monoxide. It also tests the purity of the oxygen.
Between February 2021, when it arrived on Mars aboard the Perseverance, and the end of the year, MOXIE had several successful test runs. According to a review of the system by Hoffman and colleagues, published in Science Advances, it has demonstrated its ability to produce oxygen during both night and day, when temperatures can vary by over 100 ºC. The generation and purity rates of the oxygen meet the requirements for producing rocket propellant and for breathing. The authors assert that a scaled-up version of MOXIE could produce the oxygen required for liftoff as well as for the astronauts to breathe.
Next question: How to power any oxygen-producing factories that NASA can land on Mars? Perhaps via NASA’s Kilopower fission reactors?
MOXIE is a first step toward a much larger and more complex system to support the human exploration of Mars. The researchers estimate a required generation rate of 2 to 3 kilograms per hour, compared with the current MOXIE rate of 6 to 8 grams per hour, to produce enough oxygen for lift-off for a crew arriving 26 months later. “So we’re talking about a system that’s a couple of hundred times bigger than MOXIE,” Hoffman says.
They calculate this rate accounting for eight months to get to Mars, followed by some time to set up the system. “We figure you’d probably have maybe 14 months to make all the oxygen.” Further, he says, the produced oxygen would have to be liquefied to be used as rocket propellant, something the current version of MOXIE doesn’t do.
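Those numbers are easy to sanity-check using the article’s own figures of 31 tonnes of oxygen, a roughly 14-month production window, and MOXIE’s 6 to 8 grams per hour:

```python
# Sanity-checking the scale-up arithmetic using figures from the article.
oxygen_needed_kg = 31_000            # ~31 tonnes of oxygen for ascent
production_window_h = 14 * 30 * 24   # "maybe 14 months" to make it all

required = oxygen_needed_kg / production_window_h
print(f"Required rate: {required:.1f} kg/h")   # ~3.1 kg/h

for moxie_g_per_h in (6, 8):
    factor = required / (moxie_g_per_h / 1000)
    print(f"vs. MOXIE at {moxie_g_per_h} g/h: {factor:.0f}x scale-up")
# Roughly 380-510x with these round numbers -- the same order of
# magnitude as Hoffman's "couple of hundred times bigger."
```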
MOXIE also currently faces several design constraints because, says Hoffman, a former astronaut, “our only ride to Mars was inside the Perseverance rover.” This limited the power available to operate the unit, the amount of heat it could produce, and its volume and mass.
“MOXIE does not work nearly as efficiently as a stand-alone system that was specifically designed would,” says Hoffman. Most of the time, it’s turned off. “Every time we want to make oxygen, we have to heat it up to 800 ºC, so most of the energy goes into heating it up and running the compressor, whereas in a well-designed stand-alone system, most of the energy will go into the actual electrolysis, into actually producing the oxygen.”
However, there are still many kinks to iron out in the scaling-up process. To begin with, any oxygen-producing system will need lots of power. Hoffman thinks nuclear power is the most likely option, maybe NASA’s Kilopower fission reactors. The setup and the cabling would certainly be challenging, he says. “You’re going to have to launch all of these nuclear reactors, and of course, they’re not going to be in exactly the same place as the [other] units,” he says. “So, robotically, you’re going to have to connect the electrical cables to bring power to the oxygen-producing unit.”
Then there are the solid oxide electrolysis units, which Hoffman points out are carefully machined systems. Fortunately, the company that makes them, OxEon, has already designed, built, and tested a full-scale unit, a hundred times bigger than the one on MOXIE. “Several of those units would be required to produce oxygen at the quantities that we need,” Hoffman says.
He also adds that at present, there is no redundancy built into MOXIE. If any part fails, the whole system dies. “If you’re counting on a system to produce oxygen for rocket propellant and for breathing, you need very high reliability, which means you’re going to need quite a few redundant units.”
Moreover, the system has to be pretty much autonomous, Hoffman says. “It has to be able to monitor itself, run itself.” For testing purposes, every time MOXIE is powered up, there is plenty of time to plan. A full-scale MOXIE system, though, would have to run continuously, and for that it has to be able to adjust automatically to changes in the Mars atmosphere, which can vary by a factor of two over a year, and between nighttime and daytime temperature differences.
SEMrush and Ahrefs are among the most popular tools in the SEO industry. Both companies have been in business for years and serve thousands of customers every month.
If you're a professional SEO or trying to do digital marketing on your own, at some point you'll likely consider using a tool to help with your efforts. Ahrefs and SEMrush are two names that will likely appear on your shortlist.
In this guide, I'm going to help you learn more about these SEO tools and how to choose the one that's best for your purposes.
What is SEMrush?
SEMrush is a popular SEO tool with a wide range of features—it's the leading competitor-research service for online marketers. SEMrush's Keyword Magic tool offers over 20 billion keywords for Google, constantly updated, in what the company bills as the largest keyword database available.
The program began in 2007 as SeoQuake, a small Firefox extension.
Features
What is Ahrefs?
Ahrefs is a leading SEO platform that offers a set of tools to grow your search traffic, research your competitors, and monitor your niche. The company was founded in 2010, and it has become a popular choice among SEO tools. Ahrefs has a keyword index of over 10.3 billion keywords and offers accurate and extensive backlink data, updated every 15 to 30 minutes, in what it bills as the world's most extensive backlink index.
Features
Direct Comparisons: Ahrefs vs SEMrush
Now that you know a little more about each tool, let's take a look at how they compare. I'll analyze each tool to see how they differ in interfaces, keyword research resources, rank tracking, and competitor analysis.
User Interface
Ahrefs and SEMrush both offer comprehensive information and quick metrics regarding your website's SEO performance. However, Ahrefs takes a bit more of a hands-on approach to getting your account fully set up, whereas SEMrush's simpler dashboard can give you access to the data you need quickly.
In this section, we provide a brief overview of the elements found on each dashboard and highlight the ease with which you can complete tasks.
AHREFS
The Ahrefs dashboard is less cluttered than that of SEMrush, and its primary menu is at the very top of the page, with a search bar designed only for entering URLs.
Additional features of the Ahrefs platform include:
SEMRUSH
When you log into the SEMrush tool, you will find four main modules, covering information about your domains, organic keyword analysis, ad keywords, and site traffic.
You'll also find some other options, such as:
Both Ahrefs and SEMrush have user-friendly dashboards, but Ahrefs is less cluttered and easier to navigate. On the other hand, SEMrush offers dozens of extra tools, including access to customer support resources.
When deciding on which dashboard to use, consider what you value in the user interface, and test out both.
If you're looking to track your website's search engine ranking, rank tracking features can help. You can also use them to monitor your competitors.
Let's take a look at Ahrefs vs. SEMrush to see which tool does a better job.
The Ahrefs Rank Tracker is simpler to use. Just type in the domain name and keywords you want to analyze, and it spits out a report showing you the search engine results page (SERP) ranking for each keyword you enter.
Rank Tracker looks at the ranking performance of keywords and compares them with the top rankings for those keywords. Ahrefs also offers:
You'll see metrics that help you understand your visibility, traffic, average position, and keyword difficulty.
It gives you an idea of whether a keyword would be profitable to target or not.
SEMrush offers a tool called Position Tracking. This tool is a project tool—you must set it up as a new project. Below are a few of the most popular features of the SEMrush Position Tracking tool:
All subscribers receive regular data updates and mobile search rankings.
The platform provides opportunities to track several SERP features, including Local tracking.
Intuitive reports allow you to track statistics for the pages on your website, as well as the keywords used in those pages.
Identify pages that may be competing with each other using the Cannibalization report.
Ahrefs is a more user-friendly option. It takes seconds to enter a domain name and keywords. From there, you can quickly decide whether to proceed with that keyword or figure out how to rank better for other keywords.
SEMrush lets you check your mobile rankings and provides daily ranking updates, which is something Ahrefs does not offer. SEMrush also offers social media rankings, a tool you won't find on the Ahrefs platform. Both are good; which one do you like? Let me know in the comments.
Keyword research is closely related to rank tracking, but it's used for deciding which keywords you plan on using for future content rather than those you use now.
When it comes to SEO, keyword research is the most important thing to consider when comparing the two platforms.
The Ahrefs Keyword Explorer provides you with thousands of keyword ideas and filters search results based on the chosen search engine.
Ahrefs supports several features, including:
SEMrush's Keyword Magic Tool has over 20 billion keywords for Google. You can type in any keyword you want, and a list of suggested keywords will appear.
The Keyword Magic Tool also lets you:
Both of these tools offer keyword research features and allow users to break down complicated tasks into something that can be understood by beginners and advanced users alike.
If you're interested in keyword suggestions, SEMrush appears to have more keyword suggestions than Ahrefs does. It also continues to add new features, like the Keyword Gap tool and SERP Questions recommendations.
Both platforms offer competitor-analysis tools, eliminating the need to come up with keywords off the top of your head. Each tool helps you find the keywords your competitors are targeting, so you know which ones are likely to be valuable to you.
Ahrefs' domain-comparison tool lets you compare up to five websites (your website and four competitors) side by side. It also shows you how your site ranks against others on metrics such as backlinks, domain ratings, and more.
Use the Competing Domains section to see a list of your most direct competitors, and explore how many keyword matches your competitors have.
To find more information about your competitor, you can look at the Site Explorer and Content Explorer tools and type in their URL instead of yours.
SEMrush provides a variety of insights into your competitors' marketing tactics. The platform enables you to research your competitors effectively, and it offers several resources for competitor analysis, including:
Traffic Analytics helps you identify where your audience comes from, how they engage with your site, what devices visitors use to view your site, and how your audiences overlap with other websites.
SEMrush's Organic Research examines your website's major competitors and shows their organic search rankings, the keywords they are ranking for, and even whether they rank for any SERP features.
The Market Explorer search field allows you to type in a domain and lists websites or articles similar to what you entered. Market Explorer also allows users to perform in-depth data analytics on these companies and markets.
SEMrush wins here because it has more tools dedicated to competitor analysis than Ahrefs. However, Ahrefs offers a lot of functionality in this area, too. It takes a combination of both tools to gain an advantage over your competition.
When it comes to keyword research data, it's easy to get confused about which one to choose.
Consider choosing Ahrefs if you:
Consider SEMrush if you:
Both tools are great. Choose the one that meets your requirements, and if you have any experience using either Ahrefs or SEMrush, let me know in the comment section which works well for you.