********** ENTERTAINMENT **********
Elliot Page on Juno, Hollywood’s dark side and coming out twice
Sat, 10 Jun 2023 06:00:14 GMT
When the feelgood movie made him an Oscar-nominated star, the strain of hiding who he was almost forced him to quit acting. He explains how opening up about being gay, then trans, saved his life
Elliot Page’s memoir is called Pageboy. At its heart is the story of his transitioning from an Oscar-nominated actress, best known for the wonderful coming-of-age comedy drama Juno, to one of the world’s most high profile trans men. He writes, rather beautifully, about gender dysphoria, top surgery and finally finding himself. But the book is so much more than a tale of transition.
Pageboy is a modern-day Hollywood Babylon, written by a sensitive soul rather than a scandalmonger. Page depicts a film industry even more rancid than we may have suspected. This is a world where it’s not only the Harvey Weinsteins at the top of the pyramid who get to abuse the young and powerless – just about everybody seems to have a go. It’s a world where most people appear to be closeted in one way or another, a world where more acting is done off set than on.
No one wants to see the cast naked any more, so this TV follow-up shuns stripping for comic capers and cost-of-living tragedy. Even better, it actually gives plotlines to the female characters
Television shows that remake films tend to be exercises in pointless nostalgia. Do you remember the movies Fatal Attraction, Dangerous Liaisons and American Gigolo? Yes. Would you like to watch a weird cosplay version of them that goes on for 10 hours and confusingly reshuffles the plot? Um, not really. The Full Monty (from 14 June, Disney+) is the latest entrant in an already tired genre, but it has one up on most of the competition: all the core cast are in that sweet spot where they’re successful enough to be worth rehiring but not so famous they’ve turned the reboot down. That means there’s no need to rejig the story of redundant Sheffield steelworkers who, in 1997, found solace in hard times by forming a Chippendales-style male striptease troupe. We simply return to Sheffield 26 years later, to find the same characters, played by the same actors, living the same lives.
The film had it easy, plot-wise, in that it built towards that heartwarming climactic moment when a sextet of men showed the local community their penises. Those six appendages were the pegs on which were hung serious subtexts about the misery of life in a Thatcher-ravaged, deindustrialised northern England. A quarter of a century on, however, the prospect of the old boys windmilling their hosepipes in housewives’ faces would horrify everyone. So the new Full Monty is fully clothes-on.
Animation has come a long way since 1900, when J. Stuart Blackton created The Enchanted Drawing, the earliest known animated film. The 90-second movie was created using stop-motion techniques, as flat characters, props, and backgrounds were drawn on an easel or made from paper.
Most modern animators rely on computer graphics and visualization techniques to create popular movies and TV shows like Finding Dory, Toy Story, and Paw Patrol. In the 1960s and ’70s, computer science pioneers David Evans and IEEE Life Member Ivan E. Sutherland led the development of many of the technologies animators now use. Their groundbreaking research, conducted at the University of Utah, in Salt Lake City, and at their company, Evans and Sutherland, helped jump-start the computer graphics industry.
A ceremony was held at the university on 24 March to recognize the computer graphics and visualization techniques with an IEEE Milestone. The IEEE Utah Section sponsored the nomination.
Computer graphics began in the 1950s with interactive games and visualization tools designed by the U.S. military to develop technologies for aviation, radar, and rocketry.
Evans, by then a computer science professor at the University of Utah, and Sutherland wanted to expand on the use of such tools by finding a way for computers to simulate objects and environments. In 1968 they founded Evans and Sutherland, locating the E&S headquarters in the university’s research park.
Many of today’s computer graphics luminaries—including Pixar cofounder Edwin Catmull, Adobe cofounder John Warnock, and Netscape founder Jim Clark, who also founded Silicon Graphics—got their start in the industry as E&S employees or as doctoral students working on research at the company’s facilities.
While at E&S, the employees and students made fundamental contributions to computer graphics processes, says IEEE Fellow Christopher Johnson, a University of Utah computer science professor.
“David Evans, Ivan Sutherland, and their students and colleagues helped change the world,” Johnson says.
“The period from 1968 through 1978 was an extraordinary time for computer graphics,” adds Brian Berg, IEEE Region 6 history chair. “There was a rare confluence of faculty, students, staff, facilities, and resources to support research into computer vision algorithms and hardware that produced remarkable developments in computer graphics and visualization techniques. This research was responsible for the birth of much of continuous-tone computer graphics as we know it today.” Continuous-tone computer graphics have a virtually unlimited range of color and shades of gray.
Evans began his career in 1955 at Bendix—an aviation electronics company in Avon, Ohio—as manager of a project that aimed to develop an early personal computer. He left to join the University of California, Berkeley, as chair of its computer science department. He also headed Berkeley’s research for the Pentagon’s Advanced Research Project Agency (now known as the Defense Advanced Research Projects Agency).
In 1963 Evans became a principal investigator for ARPA’s Project Genie. He helped develop hardware techniques that enabled commercial use of time-shared computer systems.
In 1965 the University of Utah hired him to establish its computer science department after receiving an ARPA grant of US $5 million to investigate how the emerging field of computer graphics could play a role in the country’s technological competitiveness, according to Computer Graphics and Computer Animation.
In 1968 Evans asked Sutherland, a former colleague at Berkeley who was then an associate professor of electrical engineering at Harvard, to join him at the University of Utah, luring him with the promise of starting a company together. Sutherland was already famous in computer graphics circles, having created Sketchpad, the first computer-aided design program, for his Ph.D. thesis in 1963 at MIT.
The two founded E&S almost as soon as Sutherland arrived, and they began working on computer-based simulation systems.
The duo in 1969 developed the line-drawing system displays LDS-1 and LDS-2, the first graphics devices with a processing unit. They then built the E&S Picture System—the next generation of LDS displays.
Those workstations, as they were called, came to be used by most computer-generated-imagery production companies through the 1980s.
E&S also developed computer-based simulation systems for military and commercial training, including the CT5 and CT6 flight simulators.
In addition to hiring employees, E&S welcomed computer science doctoral students from the university to work on their research projects at the company.
“Almost every influential person in the modern computer-graphics community either passed through the University of Utah or came into contact with it in some way,” Robert Rivlin wrote in his book, The Algorithmic Image: Graphic Visions of the Computer Age.
One of the doctoral students was Henri Gouraud, who in 1971 developed an algorithm to simulate the differing effects of light and color across the surface of an object. The Gouraud shading method is still used by creators of video games and cartoons.
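The core of Gouraud’s method can be sketched in a few lines: lighting is computed once at each vertex, then linearly interpolated across the face. The Python sketch below uses a simple Lambertian diffuse term and barycentric weights; the vectors and numbers are illustrative, not taken from the original 1971 work.

```python
# Gouraud shading sketch: intensity is computed once per vertex,
# then linearly interpolated across the triangle using barycentric weights.

def vertex_intensity(normal, light_dir):
    """Lambertian term at a vertex: max(0, N . L) for unit vectors."""
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, dot)

def gouraud_interpolate(i0, i1, i2, w0, w1, w2):
    """Interpolate vertex intensities with barycentric weights (w0+w1+w2 = 1)."""
    return i0 * w0 + i1 * w1 + i2 * w2

# Three vertex normals and a light shining straight down the z-axis.
light = (0.0, 0.0, 1.0)
intensities = [vertex_intensity(n, light)
               for n in [(0.0, 0.0, 1.0), (0.6, 0.0, 0.8), (0.0, 0.6, 0.8)]]

# Intensity at the triangle's centroid (equal weights) -- smoothly blended,
# which is what hides the faceted look of flat shading.
center = gouraud_interpolate(*intensities, 1/3, 1/3, 1/3)
```

Because the interpolation is linear, Gouraud shading is cheap enough that it remains a staple of real-time rendering.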
In 1974 Edwin Catmull, then also a doctoral student at the university, developed the principle of texture mapping, a method for adding complexity to a computer-generated surface. Catmull went on to help found Pixar in 1986 with computer scientist Alvy Ray Smith, an IEEE member. For his work in the industry, Catmull received the 2006 IEEE John von Neumann Medal.
Doctoral student Bui Tuong Phong in 1973 devised Phong shading, a modeling method that reflects light so computer-generated graphics can look shiny and plasticlike.
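The shiny look comes from a specular term added on top of the diffuse one. Here is a minimal sketch of the standard ambient-diffuse-specular form of the Phong reflection model; the coefficients and vectors below are illustrative assumptions, not values from Phong’s thesis.

```python
# Phong reflection sketch: a specular term (R . V)^shininess added to the
# diffuse term is what makes surfaces look shiny and plasticlike.
# All vectors are unit-length tuples; coefficients are illustrative.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def reflect(light_dir, normal):
    """Mirror the light direction about the surface normal: R = 2(N.L)N - L."""
    d = dot(normal, light_dir)
    return tuple(2 * d * n - l for n, l in zip(normal, light_dir))

def phong_intensity(normal, light_dir, view_dir,
                    k_ambient=0.1, k_diffuse=0.6, k_specular=0.3, shininess=32):
    diffuse = max(0.0, dot(normal, light_dir))
    specular = max(0.0, dot(reflect(light_dir, normal), view_dir)) ** shininess
    return k_ambient + k_diffuse * diffuse + k_specular * specular

# Light, viewer, and normal all aligned: the highlight is at full strength.
peak = phong_intensity((0, 0, 1), (0, 0, 1), (0, 0, 1))
```

Raising the shininess exponent tightens the highlight, which is why high values read as polished plastic or metal.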
“As a group, the University of Utah contributed more to the field of knowledge in computer graphics than any of its contemporaries,” Berg wrote in the Milestone proposal. “That fact is made most apparent both in the widespread use of the techniques developed and in the body of awards the innovations garnered.” The awards include several scientific and technical Oscars, an Emmy, and many IEEE medals.
Administered by the IEEE History Center and supported by donors, the Milestone program recognizes outstanding technical developments around the world.
The Milestone plaque displayed on a granite obelisk outside of the University of Utah’s Merrill engineering building reads:
In 1965 the University of Utah established a Center of Excellence for computer graphics research with Advanced Research Projects Agency (ARPA) funding. In 1968 two professors founded the pioneering graphics hardware company Evans & Sutherland; by 1978, fundamental rendering and visualization techniques disclosed in doctoral dissertations included the Warnock algorithm, Gouraud shading, the Catmull-Rom spline, and the Blinn-Phong reflection model. Alumni-founded companies include Atari, Silicon Graphics, Adobe, Pixar, and Netscape.
A group of researchers from NASA, MIT, and other institutions have achieved the fastest space-to-ground laser-communication link yet, doubling the record they set last year. With data rates of 200 gigabits per second, a satellite could transmit more than 2 terabytes of data—roughly as much as 1,000 high-definition movies—in a single 5-minute pass over a ground station.
“The implications are far-reaching because, put simply, more data means more discoveries,” says Jason Mitchell, an aerospace engineer at NASA’s Space Communications and Navigation program.
The new communications link was made possible with the TeraByte InfraRed Delivery (TBIRD) system orbiting about 530 kilometers above Earth’s surface. Launched into space last May, TBIRD achieved downlink rates of up to 100 Gb/s with a ground-based receiver in California by last June. This was 100 times as fast as the quickest Internet speeds in most cities, and more than 1,000 times as fast as radio links traditionally used for communications with satellites.
The fastest data networks on Earth typically rely on laser communications over fiber optics. However, a high-speed laser-based Internet does not exist yet for satellites. Instead, space agencies and commercial satellite operators most commonly use radio to communicate with objects in space. The infrared light that laser communications can employ has a much higher frequency than radio waves, enabling much higher data rates.
“There are satellites currently in orbit limited by the amount of data they are able to downlink, and this trend will only increase as more capable satellites are launched,” says Kat Riesing, an aerospace engineer and a staff member at MIT Lincoln Laboratory on the TBIRD team. “Even a hyperspectral imager—HISUI on the International Space Station—has to send data back to Earth via storage drives on cargo ships due to limitations on downlink rates. TBIRD is a big enabler for missions that collect important data on Earth’s climate and resources, as well as astrophysics applications such as black hole imaging.”
MIT Lincoln Laboratory conceived TBIRD in 2014 as a low-cost, high-speed way to access data on spacecraft. A key way it reduced expenses was by using commercial, off-the-shelf components originally developed for terrestrial use. These include high-rate optical modems developed for fiber telecommunications and high-speed large-volume storage to hold data, Riesing says.
Located onboard NASA’s Pathfinder Technology Demonstrator 3 (PTD-3) satellite, TBIRD was carried into orbit on SpaceX’s Transporter-5 rideshare mission from Cape Canaveral Space Force Station in Florida on 25 May 2022. The PTD-3 satellite is a roughly 12-kilogram CubeSat about the size of two stacked cereal boxes, and its TBIRD payload is no larger than the average tissue box. “Industry’s drive to small, low-power, high-data-rate optical transceivers enabled us to achieve a compact form factor suitable even for small satellites,” Mitchell says.
“There are satellites currently in orbit limited by the amount of data they are able to downlink, and this trend will only increase as more-capable satellites are launched.” —Kat Riesing, aerospace engineer, MIT Lincoln Laboratory
The development of TBIRD faced a number of challenges. To start with, terrestrial components are not designed to survive the rigors of launching to and operating in space. For example, during a thermal test simulating the extreme temperatures the devices might face in space, the fibers in the optical signal amplifier melted.
The problem was that when the amplifier was used as originally intended, on the ground, the surrounding air helped cool it through convection. When it was tested in a vacuum simulating space, the heat the amplifier generated was trapped. To solve the issue, the researchers worked with the amplifier’s vendor to modify it so that it released heat through conduction instead.
In addition, laser beams from space to Earth can experience distortion from atmospheric effects and weather conditions. This can cause power loss, and in turn data loss, for the beams.
To compensate, the scientists developed their own version of automatic repeat request (ARQ), a protocol for controlling errors in data transmission over a communications link. In this arrangement, the ground terminal uses a low-rate uplink signal to let the satellite know that it has to retransmit any block of data, or frame, that has been lost or damaged. The new protocol lets the ground station tell the satellite which frames it received correctly, so the satellite knows which ones to retransmit and not waste time sending data it doesn’t have to.
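The retransmission logic can be illustrated with a small simulation. This is a sketch of the general selective-repeat idea the article describes, not TBIRD’s actual protocol; the frame IDs and the lossy channel below are our assumptions.

```python
# Sketch of a selective-repeat ARQ loop: the receiver reports which frames
# arrived intact, and the sender retransmits only the missing ones,
# never wasting link time on data the ground already has.

def frames_to_retransmit(sent_frames, acked_frames):
    """Return frame IDs that were sent but never acknowledged, in order."""
    acked = set(acked_frames)
    return [f for f in sent_frames if f not in acked]

def arq_rounds(data_frames, deliver):
    """Repeat transmission rounds until every frame is acknowledged."""
    pending = list(data_frames)
    received = set()
    rounds = 0
    while pending:
        rounds += 1
        for frame in pending:
            if deliver(frame):          # the channel may drop frames
                received.add(frame)
        pending = frames_to_retransmit(pending, received)
    return received, rounds

# Toy lossy channel: drops even-numbered frames on the first attempt only.
seen = set()
def flaky_channel(frame):
    first_try = frame not in seen
    seen.add(frame)
    return not (first_try and frame % 2 == 0)

received, rounds = arq_rounds(range(6), flaky_channel)  # all 6 frames in 2 rounds
```

The low-rate uplink in the real system carries only these acknowledgments, so the expensive downlink bandwidth is spent almost entirely on fresh data.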
Another challenge the scientists faced stemmed from how lasers form in much narrower beams than radio transmissions. For successful data transmission, these beams must be aimed precisely at their receivers. This is often accomplished by mounting the laser on a gimbal. Due to TBIRD’s small size, however, it instead maneuvers the CubeSat carrying it to point it at the ground, using any error signals it receives to correct the satellite’s orientation. This gimbal-less strategy also helped further shrink TBIRD, making it cheaper to launch.
TBIRD’s architecture can support multiple channels through wavelength separation to enable higher data rates, Riesing says. This is how TBIRD accomplished a 200-Gb/s downlink on 28 April—by using two 100-Gb/s channels, she explains. “This can scale further on a future mission if the link is designed to support it,” Riesing notes.
“Put simply, more data means more discoveries.” —Jason Mitchell, aerospace engineer, NASA
The research team’s next step is to explore where to apply this technology in upcoming missions. “This technology is particularly useful for science missions where collecting a lot of data can provide significant benefits,” Riesing says. “One mission concept that is enabled by this is the Event Horizon Explorer mission, which will extend the exciting work of the Event Horizon Telescope in imaging black holes with even higher resolution.”
The scientists also want to explore how to extend this technology to different scenarios, such as geostationary orbit, Riesing says. Moreover, Mitchell says, they are looking at ways to push TBIRD’s capabilities as far away as the moon, in order to support future missions there. The rates under consideration are in the 1- to 5-Gb/s range, which “may not seem like much of an improvement, but remember the moon is roughly 400,000 km away from Earth, which is quite a long distance to cover,” Mitchell says.
The new technology may also find use in high-speed atmospheric data links on the ground. “For example, from building to building, or across inhospitable terrain, such as from mountaintop to mountaintop, where the cost of laying fiber systems could be exorbitant,” Riesing says.
On a gin-clear December day, I’m sitting under the plexiglass bubble of a radically new kind of aircraft. It’s a little past noon at the Byron Airport in northern California; in the distance, a jagged line of wind turbines atop rolling hills marks the Altamont Pass, blades spinning lazily. Above me, a cloudless blue sky beckons.
The aircraft, called BlackFly, is unlike anything else on the planet. Built by a Palo Alto, Calif., startup called Opener, it’s an electric vertical take-off and landing (eVTOL) aircraft with stubby wings fore and aft of the pilot, each with four motors and propellers. Visually, it’s as though an aerial speedster from a 1930s pulp sci-fi story has sprung from the page.
There are a couple of hundred startups designing or flying eVTOLs. But only a dozen or so are making tiny, technologically sophisticated machines whose primary purpose is to provide exhilarating but safe flying experiences to people after relatively minimal training. And in that group, Opener has jumped out to an early lead, having built dozens of aircraft at its facilities in Palo Alto and trained more than a score of people to fly them.
My own route to the cockpit of a BlackFly was relatively straightforward. I contacted the company’s CEO, Ken Karklin, in September 2022, pitched him on the idea of a story and video, and three months later I was flying one of his aircraft.
Well, sort of flying it. My brief flight was so highly automated that I was more passenger than pilot. Nevertheless, I spent about a day and a half before the flight being trained to fly the machine manually, so that I could take control if anything went wrong. For this training, I wore a virtual-reality headset and sat in a chair that tilted and gyrated to simulate flying maneuvers. To “fly” this simulation I manipulated a joystick that was identical to the one in the cockpit of a BlackFly. Opener’s chief operating officer, Kristina L. Menton, and engineer Wyatt Warner took turns patiently explaining the operations of the vehicle and giving me challenging tasks to complete, such as hovering and performing virtual landings in a vicious crosswind.
The BlackFly is entirely controlled by that joystick, which is equipped with a trigger and also topped by a thumb switch. To take off, I squeeze the trigger while simultaneously pushing forward on the switch. The machine leaps into the air with the sound of a million bees, and with a surge of giddy elation I am climbing skyward.
Much more so than an airplane or helicopter, the BlackFly taps into archetypal human yearnings for flight, the kind represented by magic carpets, the flying cars in “The Jetsons,” and even those Mountain Banshees in the movie “Avatar.” I’ve had several unusual experiences in aircraft, including flying on NASA’s zero-gravity-simulating “Vomit Comet,” and being whisked around in a BlackFly was definitely the most absorbing and delightful. Gazing out over the Altamont Pass from an altitude of about 60 meters, I had a feeling of joyous release—from Earth’s gravity and from earthly troubles.
The BlackFly is also a likely harbinger of things to come. Most of the startups developing eVTOLs are building vehicles meant to carry several passengers on commercial runs of less than 50 kilometers. Although the plan is for these to be flown by pilots initially, most of the companies anticipate a day when the flights will be completely automated. So specialized aircraft such as the BlackFly—designed to be registered and operated as “ultralight” aircraft under aviation regulations—could provide mountains of invaluable data on highly and fully automated flying and perhaps even help familiarize people with the idea of flying without a pilot. Indeed, during my flight, dozens of sensors gathered gigabytes of data, to add to the large reservoir Opener has already collected during many hundreds of test flights so far.
As of late February 2023, Opener hadn’t yet announced a retail price or an official commercial release date for the aircraft, which has been under development and testing for more than a decade. I’ll be keeping an eye out for further news of the company. Long after my flight was over I was still savoring the experience, and hoping for another one.
Special thanks to IEEE.tv for collaborating on production of this video.
Non-fungible tokens (NFTs) are among the most popular digital assets today, capturing the attention of cryptocurrency investors, whales, and people around the world. Many find it remarkable that some users spend thousands or even millions of dollars on a single NFT-based image of a monkey or other token when anyone can simply take a screenshot for free. So here we answer some frequently asked questions about NFTs.
NFT stands for non-fungible token: a cryptographic token on a blockchain with a unique identification code that distinguishes it from every other token. NFTs are unique and not interchangeable, which means no two NFTs are the same. An NFT can represent a unique artwork, a GIF, an image, a video, an audio album, an in-game item, a collectible, and so on.
A blockchain is a distributed digital ledger that allows for the secure storage of data. By recording any kind of information—such as bank account transactions, the ownership of Non-Fungible Tokens (NFTs), or Decentralized Finance (DeFi) smart contracts—in one place, and distributing it to many different computers, blockchains ensure that data can’t be manipulated without everyone in the system being aware.
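The tamper-evidence property can be sketched in a few lines of Python: each block stores the hash of the previous block, so altering any record invalidates every hash after it. This is an illustration of the principle only, not a real blockchain (there is no network, consensus, or mining here).

```python
# Minimal hash-chain sketch of blockchain tamper evidence.
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 hash of a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def build_chain(records):
    """Link each block to its predecessor via the predecessor's hash."""
    chain, prev = [], "0" * 64
    for record in records:
        block = {"data": record, "prev": prev}
        prev = block_hash(block)
        chain.append(block)
    return chain

def verify(chain):
    """Walk the chain, recomputing hashes; any edit breaks the links."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        prev = block_hash(block)
    return True

chain = build_chain(["alice pays bob 5", "bob mints NFT #1"])
ok_before = verify(chain)                    # True
chain[0]["data"] = "alice pays bob 500"      # tamper with history
ok_after = verify(chain)                     # False: later links no longer match
```

Because every participant can rerun this check independently, manipulation cannot go unnoticed, which is the guarantee the paragraph above describes.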
The value of an NFT comes from its ability to be traded freely and securely on the blockchain, which is not possible with other current digital-ownership solutions. The NFT points to its location on the blockchain but doesn’t necessarily contain the digital property itself. Fungibility is the key distinction: if you replace one bitcoin with another, you still have the same thing, whereas a non-fungible item such as a movie ticket cannot be swapped for just any other ticket, because each one is tied to a specific time and place.
One of the unique characteristics of non-fungible tokens (NFTs) is that they can be tokenised to create a digital certificate of ownership that can be bought, sold and traded on the blockchain.
As with crypto-currency, records of who owns what are stored on a ledger that is maintained by thousands of computers around the world. These records can’t be forged because the whole system operates on an open-source network.
NFTs also contain smart contracts—small computer programs that run on the blockchain—that give the artist, for example, a cut of any future sale of the token.
Non-fungible tokens (NFTs) aren't cryptocurrencies, but they do use blockchain technology. Many NFTs are based on Ethereum, where the blockchain serves as a ledger for all the transactions related to a given NFT and the properties it represents.
5) How to make an NFT?
Anyone can create an NFT. All you need is a digital wallet, some ether (Ethereum’s cryptocurrency), and a connection to an NFT marketplace where you’ll be able to upload and sell your creations.
When you purchase an NFT, that purchase is recorded on the blockchain—the ledger of transactions—and that entry acts as your proof of ownership.
The value of an NFT varies a lot based on the digital asset up for grabs. People use NFTs to trade and sell digital art, so when creating an NFT, you should consider the popularity of your digital artwork along with historical statistics.
In 2021, the digital artist Pak created an artwork called The Merge. It sold on the Nifty Gateway NFT marketplace for $91.8 million.
Non-fungible tokens can be used in investment opportunities. One can purchase an NFT and resell it at a profit. Certain NFT marketplaces let sellers of NFTs keep a percentage of the profits from sales of the assets they create.
Many people want to buy NFTs because it lets them support the arts and own something cool from their favorite musicians, brands, and celebrities. NFTs also give artists an opportunity to program in continual royalties if someone buys their work. Galleries see this as a way to reach new buyers interested in art.
There are many places to buy digital assets, like OpenSea, and their policies vary. On Top Shot, for instance, you sign up for a waitlist that can be thousands of people long. When a digital asset goes on sale, you are occasionally chosen to purchase it.
To mint an NFT, you must pay a gas fee to process the transaction on the Ethereum blockchain, but you can mint your NFT on a different blockchain, Polygon, to avoid paying gas fees. This option is available on OpenSea and simply means that your NFT can only be traded on Polygon's blockchain, not Ethereum's. Mintable also lets you mint NFTs for free, without paying any gas fees.
The answer is no. Non-fungible tokens are minted on blockchains using cryptocurrencies such as Ethereum, Solana, and Polygon. Once a non-fungible token is minted, the transaction is recorded on the blockchain, and the contract or license is awarded to whoever holds that token in their wallet.
You can sell your work and creations by attaching a license to them on the blockchain, where ownership can be transferred. This lets you gain exposure without losing full ownership of your work. Some of the most successful projects include CryptoPunks, Bored Ape Yacht Club, The Sandbox, and World of Women. These NFT projects have gained popularity globally and are owned by celebrities and other successful entrepreneurs, and owning one can open doors to exclusive business meetings and valuable connections.
That’s a wrap. I hope you found this article enlightening; I’ve answered these questions to the best of my knowledge of NFTs. If you have any questions or suggestions, feel free to drop them in the comments below. And I have a question for you: is bitcoin an NFT? Let me know in the comment section below.
Researchers used a model to predict how the smoke would move through the region and said it wouldn’t pose a health risk
Smoke from Canadian wildfires that has descended upon parts of the eastern US and Canada in a thick haze has drifted over Norway and is expected to hit southern Europe, Norwegian officials said on Friday.
Using a climate forecast model, atmosphere and climate scientists with the Norwegian climate and environmental research institute (NILU) predicted how the smoke would travel through the atmosphere, flowing over the Scandinavian country before moving further south. The smoke was not expected to pose a health risk there.
Air pollution in New York, the collapse of the Nova Kakhovka dam in Ukraine, protests in Colombo and Novak Djokovic at the French Open in Paris: the most striking images this week.
Satellite images captured from the International Space Station on Wednesday showed smoke from Canada's raging wildfires spreading to the US. The massive cloud of smoke was seen moving across Lake Superior, in the Great Lakes region, passing over Lake Huron and Lake Erie, and ending in Pennsylvania, which appears completely obscured. The smoke pushed further down the Atlantic seaboard on Thursday, blanketing Washington DC in an unhealthy haze.
Swift investment would make any Labour government a climate and economic leader – so why the dithering?
As wildfire smoke engulfs much of the east coast of the US and average global temperatures continue to rise, with the world imminently facing some of the hottest years on record, it would be an error of judgment for the Labour party to delay its green investment pledge. Doing so would not only be a mistake for our economy and the climate, but also threaten Labour’s electoral prospects, given strong public demand for bold action on this issue.
Together with its world-leading promise to end all new domestic oil and gas developments, the Labour party’s £28bn-a-year investment pledge to green industries marks the scale of climate ambition we need to see from a future British government. These commitments mark Labour out as a potential major climate leader and, like Joe Biden’s landmark Inflation Reduction Act (IRA), the investment pledge clearly demonstrates that the party is in tune with the economic realities of today’s world.
Rebecca Newsom is head of politics at Greenpeace.
Readers respond to Rowan Atkinson’s growing disillusionment with electric vehicles
Andrew Gould’s letter (4 June) highlights one flaw in Rowan Atkinson’s critique of electric cars (I love electric vehicles – and was an early adopter. But increasingly I feel duped, 3 June). Another serious flaw was to suggest it would be “sensible” to use electricity to produce synthetic fuels for petrol engines, rather than use electric cars.
This would be highly inefficient. A Guardian article last month (E-fuels: how big a niche can they carve out for cars?, 5 May) noted that only about 16% of the electricity used to produce synthetic fuels ends up propelling the car, compared with 77% for a battery-electric vehicle. To put this another way: the electricity needed to run one petrol car on synthetic fuel could run nearly five equivalent electric cars.
From sustainable fisheries to toxic battery waste, these images were chosen because they tell a compelling story about the state of our planet.
Poland has a deep and historic relationship with coal, importing huge amounts on top of its own substantial production. With the energy crisis biting, fuelled by the war in Ukraine, the country’s government withdrew restrictions on burning materials and subsidised coal, creating huge air-quality problems, particularly in the industrial south – reversing 10 years of hard work by air pollution campaigners in the process.
The Guardian visits southern Poland to witness first hand the impact of this decision on affected communities, meeting the ostracised miners at the front of the culture wars, and joining climate activists visiting towns in the region that are fighting back against fossil fuels and air pollution.
For about as long as engineers have talked about beaming solar power to Earth from space, they’ve had to caution that it was an idea unlikely to become real anytime soon. Elaborate designs for orbiting solar farms have circulated for decades—but since photovoltaic cells were inefficient, any arrays would need to be the size of cities. The plans got no closer to space than the upper shelves of libraries.
That’s beginning to change. Right now, in a sun-synchronous orbit about 525 kilometers overhead, there is a small experimental satellite called the Space Solar Power Demonstrator One (SSPD-1 for short). It was designed and built by a team at the California Institute of Technology, funded by donations from the California real estate developer Donald Bren, and launched on 3 January—among 113 other small payloads—on a SpaceX Falcon 9 rocket.
“To the best of our knowledge, this would be the first demonstration of actual power transfer in space, of wireless power transfer,” says Ali Hajimiri, a professor of electrical engineering at Caltech and a codirector of the program behind SSPD-1, the Space Solar Power Project.
The Caltech team is waiting for a go-ahead from the operators of a small space tug to which it is attached, providing guidance and attitude control. If all goes well, SSPD-1 will spend at least five to six months testing prototype components of possible future solar stations in space. In the next few weeks, the project managers hope to unfold a lightweight frame, called DOLCE (short for Deployable on-Orbit ultraLight Composite Experiment), on which parts of future solar arrays could be mounted. Another small assembly on the spacecraft contains samples of 32 different types of photovoltaic cells, intended to see which would be most efficient and robust. A third part of the vehicle contains a microwave transmitter, set up to prove that energy from the solar cells can be sent to a receiver. For this first experiment, the receivers are right there on board the spacecraft, but if it works, an obvious future step would be to send electricity via microwave to receivers on the ground.
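A back-of-envelope way to reason about microwave power transfer is the free-space Friis equation. The sketch below is illustrative only: the frequency, antenna gains, and distance are made-up assumptions, not Caltech's actual parameters for SSPD-1.

```python
# Friis free-space link estimate for wireless power transfer.
# Every number here is an assumption chosen for illustration.
import math

def friis_received_power(p_tx, gain_tx, gain_rx, wavelength, distance):
    """Received power in free space: Pr = Pt * Gt * Gr * (lambda / (4*pi*d))^2."""
    return p_tx * gain_tx * gain_rx * (wavelength / (4 * math.pi * distance)) ** 2

c = 3e8                       # speed of light, m/s
freq = 10e9                   # assumed 10 GHz transmitter
wavelength = c / freq         # 0.03 m

# Transfer across the spacecraft itself (receivers on board, ~1 m apart),
# with 1 W transmitted and modest 20 dB (100x) gains on each antenna.
p_rx = friis_received_power(p_tx=1.0, gain_tx=100, gain_rx=100,
                            wavelength=wavelength, distance=1.0)
```

The inverse-square dependence on distance is why a ground-receiving version would need far higher antenna gains, i.e. physically large, tightly focused arrays, to deliver useful power from orbit.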
Caltech’s Space Solar Power Demonstrator, shown orbiting Earth in this artist’s conception, was launched on 3 January.Caltech
One can dismiss the 50-kilogram SSPD-1 as yet another nonstarter, but a growing army of engineers and policymakers takes solar energy from space seriously. Airbus, the European aerospace company, has been testing its own technology on the ground, and government agencies in China, Japan, South Korea, and the United States have all mounted small projects. “Recent technology and conceptual advances have made the concept both viable and economically competitive,” said Frazer-Nash, a British engineering consultancy, in a 2021 report to the U.K. government. Engineers working on the technology say microwave power transmissions would be safe, unlike ionizing radiation, which harms people and other things in its path.
No single thing has happened to start this renaissance. Instead, say engineers, several advances are coming together.
For one thing, the cost of launching hardware into orbit keeps dropping, led by SpaceX and other, smaller companies such as Rocket Lab. SpaceX has a simplified calculator on its website, showing that if you want to launch a 50-kg satellite into sun-synchronous orbit, the company will do it for US $275,000.
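The arithmetic behind that quote is worth spelling out, since cost per kilogram is the figure usually compared across launch providers. A minimal sketch, using only the numbers given in the article (the live calculator may quote different prices today):

```python
# Rough cost-per-kilogram arithmetic implied by the rideshare pricing
# cited in the article for a sun-synchronous-orbit launch.
launch_price_usd = 275_000   # quoted price for the rideshare slot
payload_kg = 50              # a satellite the size of SSPD-1

cost_per_kg = launch_price_usd / payload_kg
print(f"${cost_per_kg:,.0f} per kilogram")  # prints "$5,500 per kilogram"
```

For comparison, launch costs in the Space Shuttle era were commonly estimated at tens of thousands of dollars per kilogram, which is part of why older orbital solar-farm studies never closed economically.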
Meanwhile, photovoltaic technology has improved, step by step. Lightweight electronic components keep getting better and cheaper. And there is political pressure as well: Governments and major companies, committed to decarbonizing in the battle against global climate change, are turning to renewable energy sources to replace fossil fuels.
Most solar power, at least for the foreseeable future, will be Earth-based, which will be cheaper and easier to maintain than anything anyone can launch into space. Proponents of space-based solar power say that for now, they see it as best used for specialty needs, such as remote outposts, places recovering from disasters, or even other space vehicles.
But Hajimiri says don’t underestimate the advantages of space, such as unfiltered sunlight that is far stronger than what reaches the ground and is uninterrupted by darkness or bad weather—if you can build an orbiting array light enough to be practical.
Most past designs, dictated by the technology of their times, included impossibly large truss structures to hold solar panels and wiring to route power to a central transmitter. The Caltech team would dispense with all that. An array would consist of thousands of independent tiles as small as 100 square centimeters, each with its own solar cells, transmitter, and avionics. They might be loosely connected, or they might even fly in formation.
Time-lapse images show the experimental DOLCE frame for an orbiting solar array being unfolded in a clean room.Caltech
“The analogy I like to use is that it’s like an army of ants instead of an elephant,” says Hajimiri. Transmission to receivers on the ground could be by phased array—microwave signals from the tiles synchronized so that they can be aimed with no moving parts. And the parts—the photovoltaic cells with their electronics—could perhaps be so lightweight that they’re flexible. New algorithms could keep their signals focused.
“That’s the kind of thing we’re talking about,” said Harry Atwater, a coleader of the Caltech project, as SSPD-1 was being planned. “Really gossamer-like, ultralight, the limits of mass-density deployable systems.”
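The phased-array steering that Hajimiri describes can be sketched with a few lines of trigonometry. This is an illustration of the general principle, not Caltech's design: for a uniform line of transmitter tiles, aiming the combined beam off boresight only requires giving each tile a phase offset proportional to its position. The 10-GHz frequency and 10-cm tile spacing below are assumed values for illustration.

```python
import math

# Illustrative phased-array sketch: steering a microwave beam with
# per-tile phase shifts instead of moving parts.
C = 3.0e8            # speed of light, m/s
FREQ = 10e9          # assumed microwave frequency, 10 GHz
WAVELENGTH = C / FREQ

def steering_phases(n_tiles, spacing_m, theta_deg):
    """Phase shift (radians) each tile applies so the combined beam
    points theta_deg away from boresight."""
    k = 2 * math.pi / WAVELENGTH          # wavenumber
    theta = math.radians(theta_deg)
    return [k * i * spacing_m * math.sin(theta) for i in range(n_tiles)]

# Four tiles, 10 cm apart, beam steered 10 degrees off boresight
phases = steering_phases(n_tiles=4, spacing_m=0.1, theta_deg=10)
```

Changing `theta_deg` re-aims the beam electronically, which is why a loosely connected swarm of tiles can still deliver power to one ground receiver, provided the array knows each tile's position precisely.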
If it works out, in 30 years maybe there could be orbiting solar power fleets, adding to the world’s energy mix. In other words, as a recent report from Frazer-Nash concluded, this is “a potential game changer.”
This article appears in the April 2023 print issue as “Trial Run for Orbiting Solar Array.”
At IEEE, we know that the advancement of science and technology is the engine that drives the improvement of the quality of life for every person on this planet. Unfortunately, as we are all aware, today’s world faces significant challenges, including escalating conflicts, a climate crisis, food insecurity, gender inequality, and the approximately 2.7 billion people who cannot access the Internet.
The COVID-19 pandemic exposed the digital divide like never before. The world saw the need for universal broadband connectivity for remote work, online education, telemedicine, entertainment, and social networking. Those who had access thrived while those without it struggled. As millions of classrooms moved online, the lack of connectivity made it difficult for some students to participate in remote learning. Adults who could not perform their job virtually faced layoffs or reduced work hours.
The pandemic also exposed weaknesses in the global infrastructure that supports the citizens of the world. It became even more apparent that vital communications, computing, energy, and distribution infrastructure was not always equitably distributed, particularly in less developed regions.
I had the pleasure of presenting the 2023 IEEE President’s Award to Doreen Bogdan-Martin, secretary-general of the International Telecommunication Union, on 28 March, at ITU’s headquarters in Geneva. The award recognizes her distinguished leadership at the agency and her notable contributions to the global public.
It is my honor to recognize such a transformational leader and IEEE member for her demonstrated commitment to bridging the digital divide and to ensuring connectivity that is safe, inclusive, and affordable to all.
Nearly 45 percent of global households do not have access to the Internet, according to UNESCO. A report from UNICEF estimates that nearly two-thirds of the world’s schoolchildren lack Internet access at home.
This digital divide particularly affects women, who are 23 percent less likely than men to use the Internet. According to UNESCO, in 10 countries across Africa, Asia, and South America, women are between 30 percent and 50 percent less likely than men to make use of the Internet.
Even in developed countries, Internet access is often lower than one might imagine. More than 6 percent of the U.S. population does not have a high-speed connection. In Australia, the figure is 13 percent. Globally, just over half of households have an Internet connection, according to UNESCO. In the developed world, 87 percent are connected, compared with 47 percent in developing nations and just 19 percent in the least developed countries.
As IEEE looks to lead the development of technology to tackle climate change and empower universal prosperity, it is essential that we recognize the role that meaningful connectivity and digital technology play in the organization’s goals to support global sustainability, drive economic growth, and transform health care, education, employment, gender equality, and youth empowerment.
IEEE members around the globe are continuously developing and applying technology to help solve these problems. It is that universal passion—to improve global conditions—that is at the heart of our mission, as well as our expanding partnerships and significant activities supporting the achievement of the U.N. Sustainable Development Goals.
One growing partnership is with the International Telecommunication Union, a U.N. specialized agency that helps set policy related to information and communication technologies. IEEE Member Doreen Bogdan-Martin was elected as ITU secretary-general and took office on 1 January, becoming the first woman to lead the 155-year-old organization. Bogdan-Martin is the recipient of this year’s IEEE President’s Award [see sidebar].
IEEE and ITU share the goal of bringing the benefits of technology to all of humanity. I look forward to working closely with the U.N. agency to promote meaningful connectivity, intensify cooperation to connect the unconnected, and strengthen the alignment of digital technologies with inclusive sustainable development.
I truly believe that one of the most important applications of technology is to improve people’s lives. For those in underserved regions of the world, technology can improve educational opportunities, provide better health care, alleviate suffering, and maintain human dignity.
Technology and technologists, particularly IEEE members, have a significant role to play in shaping life on this planet. They can use their skills to develop and advance technology—from green energy to reducing waste and emissions, and from transportation electrification to digital education, health, and agriculture. As a person who believes in the power of technology to benefit humanity, I find this to be a very compelling vision for our shared future.
Please share your thoughts with me: firstname.lastname@example.org.
IEEE president and CEO
This article appears in the June 2023 print issue as “Connecting the Unconnected.”
Here are some answers about the new social media network Bluesky that you don’t need an invite to see.
Everything burns. Given the right conditions, nearly any material can combust when supplied with oxygen, but the mix of fuel, oxygen, and heat determines how readily one material burns compared with another. Researchers interested in knowing more about a type of fire called discrete burning used ESA’s microgravity experiment facilities to investigate.
The 19-seater Dornier 228 propeller plane that took off into the cold blue January sky looked ordinary at first glance. Spinning its left propeller, however, was a 2-megawatt electric motor powered by two hydrogen fuel cells—the right side ran on a standard kerosene engine—making it the largest aircraft flown on hydrogen to date. Val Miftakhov, founder and CEO of ZeroAvia, the California startup behind the 10-minute test flight in Gloucestershire, England, called it a “historical day for sustainable aviation.”
Los Angeles–based Universal Hydrogen plans to test a 50-seat hydrogen-powered aircraft by the end of February. Both companies promise commercial flights of retrofitted turboprop aircraft by 2025. French aviation giant Airbus is going bigger with a planned 2026 demonstration flight of its iconic A380 passenger airplane, which will fly using hydrogen fuel cells and by burning hydrogen directly in an engine. And Rolls Royce is making headway on aircraft engines that burn pure hydrogen.
The aviation industry, responsible for some 2.5 percent of global carbon emissions, has committed to net-zero emissions by 2050. Getting there will require several routes, including sustainable fuels, hybrid-electric engines, and battery-electric aircraft.
Hydrogen is another potential route. Whether used to make electricity in fuel cells or burned in an engine, it combines with oxygen to emit water vapor. If green hydrogen scales up for trucks and ships, it could be a low-cost fuel without the environmental issues of batteries.
Flying on hydrogen brings storage and aircraft-certification challenges, but aviation companies are doing the groundwork now for hydrogen flight by 2035. “Hydrogen is headed off to the sky, and we’re going to take it there,” says Amanda Simpson, vice president for research and technology at Airbus Americas.
The most plentiful element, hydrogen is also the lightest—key for an industry fighting gravity—packing three times the energy of jet fuel by weight. The problem with hydrogen is its volume. For transport, it has to be stored in heavy tanks either as a compressed high-pressure gas or a cryogenic liquid.
ZeroAvia is using compressed hydrogen gas, since it is already approved for road transport. Its test airplane had two hydrogen fuel cells and tanks sitting inside the cabin, but the team is now thinking creatively about a compact system with minimal changes to aircraft design to speed up certification in the United States and Europe. The fuel cells’ added weight could reduce flying range, but “that’s not a problem, because aircraft are designed to fly much further than they’re used,” says vice president of strategy James McMicking.
The company has backing from investors that include Bill Gates and Jeff Bezos; partnerships with British Airways and United Airlines; and 1,500 preorders for its hydrogen-electric power-train system, half of which are for smaller, 400-kilometer-range 9- to 19-seaters.
By 2027, ZeroAvia plans to convert larger, 70-seater turboprop aircraft with twice the range, used widely in Europe. The company is developing 5-MW electric motors for those, and it plans to switch to more energy-dense liquid hydrogen to save space and weight. The fuel is novel for the aviation industry and could require a longer regulatory approval process, McMicking says.
Next will come a 10-MW power train for aircraft with 100 to 150 seats, “the workhorses of the industry,” he says. Those planes—think Boeing 737—are responsible for 60 percent of aviation emissions. Making a dent in those with hydrogen will require much more efficient fuel cells. ZeroAvia is working on proprietary high-temperature fuel cells for that, McMicking says, with the ability to reuse the large amounts of waste heat generated. “We have designs and a technology road map that takes us into jet-engine territory for power,” he says.
Universal Hydrogen, which counts Airbus, GE Aviation, and American Airlines among its strategic investors, is placing bets on liquid hydrogen. The startup, “a hydrogen supply and logistics company at our core,” wants to ensure a seamless delivery network for hydrogen aviation as it picks up speed, says founder and CEO Paul Eremenko. The company sources green hydrogen, turns it into liquid, and puts it in relatively low-tech insulated aluminum tanks that it will deliver via road, rail, or ship. “We want them certified by the Federal Aviation Administration for 2025, which means they can’t be a science project,” he says.
The cost of green hydrogen is expected to be on par with kerosene by 2025, Eremenko says. But “there’s nobody out there with an incredible hydrogen-airplane solution. It’s a chicken-and-egg problem.”
To crack it, Universal Hydrogen partnered with leading fuel-cell-maker Plug Power to develop a few thousand conversion kits for regional turboprop airplanes. The kits swap the engine in its streamlined housing (also known as nacelle) for a fuel-cell stack, power electronics, and a 2-MW electric motor. While the company’s competitors use batteries as buffers during takeoff, Eremenko says Universal uses smart algorithms to manage fuel cells, so they can ramp up and respond quickly. “We are the Nespresso of hydrogen,” he says. “We buy other people’s coffee, put it into capsules, and deliver to customers. But we have to build the first coffee machine. We’re the only company incubating the chicken and egg at the same time.”
This rendering of an Airbus A380 demonstrator flight (presently slated for 2026) reveals current designs on an aircraft that’s expected to fly using fuel cells and by burning hydrogen directly in the engine. Airbus
Fuel cells have a few advantages over a large central engine. They allow manufacturers to spread out smaller propulsion motors over an aircraft, giving them more design freedom. And because there are no high-temperature moving parts, maintenance costs can be lower. For long-haul aircraft, however, the weight and complexity of high-power fuel cells make hydrogen-combustion engines appealing.
Airbus is considering both fuel-cell and combustion propulsion for its ZEROe hydrogen aircraft system. It has partnered with German automotive fuel-cell-maker Elring Klinger and, for direct combustion engines, with CFM International, a joint venture between GE Aviation and Safran. Burning liquid hydrogen in today’s engines is still expected to require slight modifications, such as a shorter combustion chamber and better seals.
Airbus is also evaluating hybrid propulsion concepts with a hydrogen-engine-powered turbine and a hydrogen-fuel-cell-powered motor on the same shaft, says Simpson, of Airbus Americas. “Then you can optimize it so you use both propulsion systems for takeoff and climb, and then turn one off for cruising.”
The company isn’t limiting itself to simple aircraft redesign. Hydrogen tanks could be stored in a cupola on top of the plane, pods under the wings, or a large tank at the back, Simpson says. Without liquid fuel in the wings, as in traditional airplanes, she says, “you can optimize wings for aerodynamics, make them thinner or longer. Or maybe a blended-wing body, which could be very different. This opens up the opportunity to optimize aircraft for efficiency.” Certification for such new aircraft could take years, and Airbus isn’t expecting commercial flights until 2035.
Conventional aircraft made today will be around in 2050 given their 25- to 30-year life-span, says Robin Riedel, an analyst at McKinsey & Co. Sustainable fuels are the only green option for those. He says hydrogen could play a role there, through “power-to-liquid technology, where you can mix hydrogen and captured carbon dioxide to make aviation fuel.”
Even then, Riedel thinks hydrogen will likely be a small part of aviation’s sustainability solution until 2050. “By 2070, hydrogen is going to play a much bigger role,” he says. “But we have to get started on hydrogen now.” The money that Airbus and Boeing are putting into hydrogen is a small fraction of aerospace, he says, but big airlines investing in hydrogen companies or placing power-train orders “shows there is desire.”
The aviation industry has to clean up if it is to grow, Simpson says. Biofuels are a stepping-stone, because they reduce only carbon emissions, not other harmful ones. “If we’re going to move towards clean aviation, we have to rethink everything from scratch and that’s what ZEROe is doing,” she says. “This is an opportunity to make not an evolutionary change but a truly revolutionary one.”
This article appears in the April 2023 print issue as “Hydrogen-Powered Flight Cleared for Takeoff.”
This sponsored article is brought to you by COMSOL.
History teaches that the Industrial Revolution began in England in the mid-18th century. While that era of sooty foundries and mills is long past, manufacturing remains essential — and challenging. One promising way to meet modern industrial challenges is by using additive manufacturing (AM) processes, such as powder bed fusion and other emerging techniques. To fulfill its promise of rapid, precise, and customizable production, AM demands more than just a retooling of factory equipment; it also calls for new approaches to factory operation and management.
That is why Britain’s Manufacturing Technology Centre (MTC) has enhanced its in-house metal powder bed fusion AM facility with a simulation model and app to help factory staff make informed decisions about its operation. The app, built using the Application Builder in the COMSOL Multiphysics software, shows the potential for pairing a full-scale AM factory with a so-called “digital twin” of itself.
“The model helps predict how heat and humidity inside a powder bed fusion factory may affect product quality and worker safety,” says Adam Holloway, a technology manager within the MTC’s modeling team. “When combined with data feeds from our facility, the app helps us integrate predictive modeling into day-to-day decision-making.” The MTC project demonstrates the benefits of placing simulation directly into the hands of today’s industrial workforce and shows how simulation could help shape the future of manufacturing.
“We’re trying to present the findings of some very complex calculations in a simple-to-understand way. By creating an app from our model, we can empower staff to run predictive simulations on laptops during their daily shifts.”
—Adam Holloway, MTC Technology Manager
To help modern British factories keep pace with the world, the MTC promotes high-value manufacturing throughout the United Kingdom. The MTC is based in the historic English industrial city of Coventry (Figure 2), but its focus is solely on the future. That is why the team has committed significant human and technical resources to its National Centre for Additive Manufacturing (NCAM).
“Adopting AM is not just about installing new equipment. Our clients are also seeking help with implementing the digital infrastructure that supports AM factory operations,” says Holloway. “Along with enterprise software and data connectivity, we’re exploring how to embed simulation within their systems as well.”
The NCAM’s Digital Reconfigurable Additive Manufacturing for Aerospace (DRAMA) project provides a valuable venue for this exploration. Developed in concert with numerous manufacturers, the DRAMA initiative includes the new powder bed fusion AM facility mentioned previously. With that mini factory as DRAMA’s stage, Holloway and his fellow simulation specialists play important roles in making its production of AM aerospace components a success.
What makes a manufacturing process “additive”, and why are so many industries exploring AM methods? In the broadest sense, an additive process is one where objects are created by adding material layer by layer, rather than removing it or molding it. A reductive or subtractive process for producing a part may, for example, begin with a solid block of metal that is then cut, drilled, and ground into shape. An additive method for making the same part, by contrast, begins with empty space! Loose or soft material is then added to that space (under carefully controlled conditions) until it forms the desired shape. That pliable material must then be solidified into a durable finished part.
Different materials demand different methods for generating and solidifying additive forms. For example, common 3D printers sold to consumers produce objects by unspooling warm plastic filament, which bonds to itself and becomes harder as it cools. By contrast, the metal powder bed fusion process (Ref. 1) begins with, as its name suggests, a powdered metal which is then melted by applied heat and re-solidified when it cools. A part produced via the metal powder bed fusion process can be seen in Figure 3.
“The market opportunities for AM methods have been understood for a long time, but there have been many obstacles to large-scale adoption,” Holloway says. “Some of these obstacles can be overcome during the design phase of products and AM facilities. Other issues, such as the impact of environmental conditions on AM production, must be addressed while the facility is operating.”
For instance, maintaining careful control of heat and humidity is an essential task for the DRAMA team. “The metal powder used for the powder bed fusion process (Figure 4) is highly sensitive to external conditions,” says Holloway. “This means it can begin to oxidize and pick up ambient moisture even while it sits in storage, and those processes will continue as it moves through the facility. Exposure to heat and moisture will change how it flows, how it melts, how it picks up an electric charge, and how it solidifies,” he says. “All of these factors can affect the resulting quality of the parts you’re producing.”
Careless handling of powdered metal is not just a threat to product quality. It can threaten the health and safety of workers as well. “The metal powder used for AM processes is flammable and toxic, and as it dries out, it becomes even more flammable,” Holloway says. “We need to continuously measure and manage humidity levels, as well as how loose powder propagates throughout the facility.”
To maintain proper atmospheric conditions, a manufacturer could augment its factory’s ventilation with a full climate control system, but that could be prohibitively expensive. The NCAM estimated that it would cost nearly half a million pounds to add climate control to its relatively modest facility. But what if they could adequately manage heat and humidity without adding such a complicated system?
Perhaps using multiphysics simulation for careful process management could provide a cost-effective alternative. “As part of the DRAMA program, we created a model of our facility using the computational fluid dynamics (CFD) capabilities of the COMSOL software. Our model (Figure 5) uses the finite element method to solve partial differential equations describing heat transfer and fluid flow across the air domain in our facility,” says Holloway. “This enabled us to study how environmental conditions would be affected by multiple variables, from the weather outside, to the number of machines operating, to the way machines were positioned inside the shop. A model that accounts for those variables helps factory staff adjust ventilation and production schedules to optimize conditions,” he explains.
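To make the idea of such a facility model concrete, here is a deliberately tiny 1-D illustration of the kind of heat-transfer equation involved. This is not the MTC's COMSOL model (which solves coupled heat transfer and fluid flow in 3-D with the finite element method); it is an explicit finite-difference sketch of heat diffusion along a line of points between a warm machine wall and a cooler ventilated wall, with all physical parameters assumed for illustration:

```python
# Toy 1-D heat diffusion: dT/dt = alpha * d2T/dx2, explicit
# finite-difference update with fixed-temperature boundaries.
ALPHA = 1e-5   # assumed thermal diffusivity of air region, m^2/s
DX = 0.5       # grid spacing, m
DT = 1000.0    # time step, s (stable: ALPHA*DT/DX**2 = 0.04 <= 0.5)

def step(temps):
    """One explicit update; endpoints are fixed boundary conditions."""
    r = ALPHA * DT / DX**2
    inner = [t + r * (left - 2 * t + right)
             for left, t, right in zip(temps, temps[1:], temps[2:])]
    return [temps[0]] + inner + [temps[-1]]

# Machine wall at 40 C, shop floor at 20 C, door wall at 15 C
temps = [40.0] + [20.0] * 9 + [15.0]
for _ in range(200):
    temps = step(temps)
# temps now shows heat soaking in from the machine side and
# leaking out toward the cooler, ventilated side.
```

A production model like the MTC's adds humidity transport, airflow, and real geometry, but the decision-support logic is the same: given boundary conditions (weather, machines running, doors open), predict how the interior field evolves over a shift.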
The DRAMA team made their model more accessible by building a simulation app of it with the Application Builder in COMSOL Multiphysics (Figure 6). “We’re trying to present the findings of some very complex calculations in a simple-to-understand way,” Holloway explains. “By creating an app from our model, we can empower staff to run predictive simulations on laptops during their daily shifts.”
The app user can define relevant boundary conditions for the beginning of a factory shift and then make ongoing adjustments. Over the course of a shift, heat and humidity levels will inevitably fluctuate. Perhaps factory staff should alter the production schedule to maintain part quality, or maybe they just need to open doors and windows to improve ventilation. Users can change settings in the app to test the possible effects of actions like these. For example, Figure 8 presents isothermal surface plots that show the effect that opening the AM machines’ build chambers has on air temperature, while Figure 9 shows how airflow is affected by opening the facility doors.
While the current app is an important step forward, it does still require workers to manually input relevant data. Looking ahead, the DRAMA team envisions something more integral, and therefore, more powerful: a “digital twin” for its AM facility. A digital twin, as described by Ed Fontes in a 2019 post on the COMSOL Blog (Ref. 2), is “a dynamic, continuously updated representation of a real physical product, device, or process.” It is important to note that even the most detailed model of a system is not necessarily its digital twin.
“To make our factory environment model a digital twin, we’d first provide it with ongoing live data from the actual factory,” Holloway explains. “Once our factory model was running in the background, it could adjust its forecasts in response to its data feeds and suggest specific actions based on those forecasts.”
“We want to integrate our predictive model into a feedback loop that includes the actual factory and its staff. The goal is to have a holistic system that responds to current factory conditions, uses simulation to make predictions about future conditions, and seamlessly makes self-optimizing adjustments based on those predictions,” Holloway says. “Then we could truly say we’ve built a digital twin for our factory.”
As an intermediate step toward building a full factory-level digital twin, the DRAMA simulation app has already proven its worth. “Our manufacturing partners may already see how modeling can help with planning an AM facility, but not really understand how it can help with operation,” Holloway says. “We’re showing the value of enabling a line worker to open up the app, enter in a few readings or import sensor data, and then quickly get a meaningful forecast of how a batch of powder will behave that day.”
Beyond its practical insights for manufacturers, the overall project may offer a broader lesson as well: By pairing its production line with a dynamic simulation model, the DRAMA project has made the entire operation safer, more productive, and more efficient. The DRAMA team has achieved this by deploying the model where it can do the most good — into the hands of the people working on the factory floor.
At Moffett Field in Mountain View, Calif., Lighter Than Air (LTA) Research is floating a new approach to a technology that saw its rise and fall a century ago: airships. Although airships have long since been supplanted by planes, LTA, which was founded in 2015 by CEO Alan Weston, believes that through a combination of new materials, better construction techniques, and technological advancements, airships are poised—not to reclaim the skies, certainly—but to find a new niche.
Although airships never died off entirely—the Goodyear blimps, familiar to sports fans, are proof of that—the industry was already in decline by 1937, the year of the Hindenburg disaster. By the end of World War II, airships couldn’t compete with the speed airplanes offered, and they required larger crews. Today, the airships that still linger serve primarily for advertising and sightseeing.
LTA’s Pathfinder 1 carries bigger dreams than hovering over a sports stadium, however. The company sees a natural fit for airships in humanitarian and relief missions. According to Carl Taussig, LTA’s chief technical officer, airships can stay aloft for long periods when ground conditions aren’t ideal, have a long range, and carry significant payloads.
Pathfinder’s cigar-shaped envelope is just over 120 meters in length and 20 meters in diameter. While that dwarfs Goodyear’s current, 75-meter Wingfoot One, it’s still only half the length of the Hindenburg. LTA expects Pathfinder 1 to carry approximately 4 tonnes of cargo, in addition to its crew, water ballast, and fuel. The airship will have a top speed of 65 knots, or about 120 kilometers per hour—on par with the Hindenburg—with a sustained cruise speed of 35 to 40 knots (65 to 75 km/h).
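Those dimensions make a rough lift estimate possible. The sketch below is back-of-the-envelope only: the fill factor for the tapered cigar shape and the net lift of helium (about 1.05 kg per cubic meter at sea level) are assumptions, not LTA figures.

```python
import math

# Rough gross-lift estimate from the envelope dimensions in the article.
LENGTH_M = 120.0
DIAMETER_M = 20.0
FILL_FACTOR = 0.7            # assumed: cigar shape vs. a full cylinder
NET_LIFT_KG_PER_M3 = 1.05    # approx. air density minus helium density

volume = math.pi * (DIAMETER_M / 2) ** 2 * LENGTH_M * FILL_FACTOR
gross_lift_tonnes = volume * NET_LIFT_KG_PER_M3 / 1000
# Roughly 26,000 m^3 of helium and on the order of 27-28 tonnes of
# gross lift, which must cover structure, crew, ballast, and fuel
# before the ~4 tonnes of cargo the article cites.
```

The point of the exercise is why airship engineers obsess over structural weight: most of the gross lift is consumed before any payload goes aboard.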
It may not seem much of an advance to be building an airship that flies no faster than the Hindenburg. But Pathfinder 1 carries a lot of new tech that LTA is betting will prove key to an airship resurgence.
For one, airships used to be constructed around riveted aluminum girders, which provided the highest strength-to-weight ratio available at the time. Instead, LTA will be using carbon-fiber tubes attached to titanium hubs. As a result, Pathfinder 1’s primary structure will be both stronger and lighter.
Pathfinder 1’s outer covering is also a step up from past generations. Airships like the 1930s’ Graf Zeppelin had coverings made out of doped cotton canvas. The dope painted on the fabric increased its strength and resiliency. But canvas is still canvas. LTA has instead built its outer coverings out of a three-layer laminate of synthetics. The outermost layer is DuPont’s Tedlar, which is a polyvinyl fluoride. The middle layer is a loose weave of fire-retardant aramid fibers. The inner layer is polyester. “It’s very similar to what’s used in a lot of racing sailboats,” says Taussig. “We needed to modify that material to make it fire resistant and change a little bit about its structural performance.”
But neither the materials science nor the manufacturing advances will take primary credit for LTA’s looked-for success, according to Taussig—instead, it’s the introduction of electronics. “Everything’s electric on Pathfinder,” he says. “All the actuation, all the propulsion, all the actual power is all electrically generated. It’s a fully electric fly-by-wire aircraft, which is not something that was possible 80 years ago.” Pathfinder 1 has 12 electric motors for propulsion, as well as four tail fins with steering rudders controlled by its fly-by-wire system. (During initial test flights, the airship will be powered by two reciprocating aircraft engines).
There’s one other piece of equipment making an appearance on Pathfinder 1 that wasn’t available 80 years ago: lidar. Installed at the top of each of Pathfinder 1’s helium gas cells is an automotive-grade lidar. “The lidar can give us a point cloud showing the entire internal hull of that gas cell,” says Taussig, which can then be used to determine the gas cell’s volume accurately. In flight, the airship’s pilots can use that information, as well as data about the helium’s purity, pressure, and temperature, to better keep the craft pitched properly and to avoid extra stress on the internal structure during flight.
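How does a point cloud of a gas cell's hull become a volume? The article doesn't describe LTA's processing, but one standard approach is a sketch worth showing: triangulate the scanned surface into a closed mesh, then sum signed tetrahedron volumes against the origin (an application of the divergence theorem). The helper below is illustrative, tested here on a unit cube instead of a gas cell.

```python
# Volume of a closed triangle mesh via summed signed tetrahedra.
def mesh_volume(vertices, triangles):
    total = 0.0
    for i, j, k in triangles:
        (ax, ay, az), (bx, by, bz), (cx, cy, cz) = (
            vertices[i], vertices[j], vertices[k])
        # Signed tetra volume = scalar triple product a . (b x c) / 6
        total += (ax * (by * cz - bz * cy)
                  - ay * (bx * cz - bz * cx)
                  + az * (bx * cy - by * cx)) / 6.0
    return abs(total)

# Sanity check: a unit cube (8 vertices, 12 triangles) has volume 1.
verts = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
tris = [(0, 1, 3), (0, 3, 2), (4, 6, 7), (4, 7, 5),
        (0, 4, 5), (0, 5, 1), (2, 3, 7), (2, 7, 6),
        (0, 2, 6), (0, 6, 4), (1, 5, 7), (1, 7, 3)]
```

With the cell's volume in hand, plus the helium's purity, pressure, and temperature, the ideal gas law gives the mass of lifting gas in each cell, which is what the pilots ultimately need for trim.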
Although LTA’s initial focus is on humanitarian applications, there are other areas where airships might shine one day. “An airship is kind of a ‘tweener,’ in between sea cargo and air freight,” says Taussig. Being fully electric, Pathfinder 1 is also greener than traditional air- or sea-freight options.
After completing Pathfinder 1’s construction late in 2022, LTA plans to conduct a series of ground tests on each of the airship’s systems in the first part of 2023. Once the team is satisfied with those tests, they’ll move to tethered flight tests and finally untethered flight tests over San Francisco’s South Bay later in the year.
The company will also construct an approximately 180-meter-long airship, Pathfinder 3, at its Akron Airdock facility in Ohio. Pathfinder 3 won’t be ready to fly in 2023, but its development shows that LTA’s aspirations for an airship renaissance are more than just hot air.
This article appears in the January 2023 print issue as “The Return of the Airship.”
A rocket built by Indian startup Skyroot has become the country’s first privately developed launch vehicle to reach space, following a successful maiden flight earlier today. The suborbital mission is a major milestone for India’s private space industry, say experts, though more needs to be done to nurture the fledgling sector.
The Vikram-S rocket, named after the founder of the Indian space program, Vikram Sarabhai, lifted off from the Indian Space Research Organization’s (ISRO) Satish Dhawan Space Centre, on India’s east coast, at 11:30 a.m. local time (1 a.m. eastern time). It reached a peak altitude of 89.5 kilometers (55.6 miles), crossing the 80-km line that NASA counts as the boundary of space, but falling just short of the 100 km recognized by the Fédération Aéronautique Internationale.
Pawan Kumar Chandana, cofounder of the Hyderabad-based startup, says the success of the launch is a major victory for India’s nascent space industry, but the buildup to the mission was nerve-racking. “We were pretty confident on the vehicle, but, as you know, rockets are very notorious for failure,” he says. “Especially in the last 10 seconds of countdown, the heartbeat was racing up. But once the vehicle had crossed the launcher and then went into the stable trajectory, I think that was the moment of celebration.”
At just 6 meters (20 feet) long and weighing only around 550 kilograms (about 1,200 pounds), the Vikram-S is not designed for commercial use. Today’s mission, called Prarambh, which means “the beginning” in Sanskrit, was designed to test key technologies that will be used to build the startup’s first orbital rocket, the Vikram I. The rocket will reportedly be capable of lofting as much as 480 kg up to a 500-km altitude and is slated for a maiden launch next October.
Skyroot cofounder Pawan Kumar Chandana standing in front of the Vikram-S rocket at the Satish Dhawan Space Centre, on the east coast of India. Photo: Skyroot
In particular, the mission has validated Skyroot’s decision to go with a novel all-carbon fiber structure to cut down on weight, says Chandana. It also allowed the company to test 3D-printed thrusters, which were used for spin stabilization in Vikram-S but will power the upper stages of its later rockets. Perhaps the most valuable lesson, though, says Chandana, was the complexity of interfacing Skyroot's vehicle with ISRO’s launch infrastructure. “You can manufacture the rocket, but launching it is a different ball game,” he says. “That was a great learning experience for us and will really help us accelerate our orbital vehicle.”
Skyroot is one of several Indian space startups looking to capitalize on recent efforts by the Indian government to liberalize its highly regulated space sector. Due to the dual-use nature of space technology, ISRO has historically had a government-sanctioned monopoly on most space activities, says Rajeswari Pillai Rajagopalan, director of the Centre for Security, Strategy and Technology at the Observer Research Foundation think tank, in New Delhi. While major Indian engineering players like Larsen & Toubro and Godrej Aerospace have long supplied ISRO with components and even entire space systems, the relationship has been one of a supplier and vendor, she says.
But in 2020, Finance Minister Nirmala Sitharaman announced a series of reforms to allow private players to build satellites and launch vehicles, carry out launches, and provide space-based services. The government also created the Indian National Space Promotion and Authorisation Centre (InSpace), a new agency designed to act as a link between ISRO and the private sector, and affirmed that private companies would be able to take advantage of ISRO’s facilities.
The first launch of a private rocket from an ISRO spaceport is a major milestone for the Indian space industry, says Rajagopalan. “This step itself is pretty crucial, and it’s encouraging to other companies who are looking at this with a lot of enthusiasm and excitement,” she says. But more needs to be done to realize the government’s promised reforms, she adds. The Space Activities Bill that is designed to enshrine the country’s space policy in legislation has been languishing in draft form for years, and without regulatory clarity, it’s hard for the private sector to justify significant investments. “These are big, bold statements, but these need to be translated into actual policy and regulatory mechanisms,” says Rajagopalan.
Skyroot’s launch undoubtedly signals the growing maturity of India’s space industry, says Saurabh Kapil, associate director in PwC’s space practice. “It’s a critical message to the Indian space ecosystem, that we can do it, we have the necessary skill set, we have those engineering capabilities, we have those manufacturing or industrialization capabilities,” he says.
The Vikram-S rocket blasting off from the Satish Dhawan Space Centre, on the east coast of India. Photo: Skyroot
However, crossing this technical milestone is only part of the challenge, he says. The industry also needs to demonstrate a clear market for the kind of launch vehicles that companies like Skyroot are building. While private players are showing interest in launching small satellites for applications like agriculture and infrastructure monitoring, he says, these companies will be able to build sustainable businesses only if they are allowed to compete for more lucrative government and defense-sector contracts.
In the longer run, though, India’s space industry has ambitions of capturing a significant chunk of the global launch market, says Kapil. ISRO has already developed a reputation for both reliability and low cost—its 2014 mission to Mars cost just US $74 million, one-ninth the cost of a NASA Mars mission launched the same week. That is likely to translate to India’s private space industry, too, thanks to a considerably lower cost of skilled labor, land, and materials compared with those of other spacefaring nations, says Kapil. “The optimism is definitely there that because we are low on cost and high on reliability, whoever wants to build and launch small satellites is largely going to come to India,” he says.
SEMrush and Ahrefs are among the most popular tools in the SEO industry. Both companies have been in business for years and have thousands of customers per month.
If you're a professional SEO or trying to do digital marketing on your own, at some point you'll likely consider using a tool to help with your efforts. Ahrefs and SEMrush are two names that will likely appear on your shortlist.
In this guide, I'm going to help you learn more about these SEO tools and how to choose the one that's best for your purposes.
What is SEMrush?
SEMrush is a popular SEO tool with a wide range of features—it's the leading competitor-research service for online marketers. SEMrush's Keyword Magic tool offers over 20 billion keywords for Google, constantly updated, in what the company says is the largest keyword database available.
The program began in 2007 as SeoQuake, a small Firefox extension.
What is Ahrefs?
Ahrefs is a leading SEO platform that offers a set of tools to grow your search traffic, research your competitors, and monitor your niche. The company was founded in 2010 and has since become a popular choice among SEO tools. Ahrefs has a keyword index of over 10.3 billion keywords and offers accurate, extensive backlink data updated every 15 to 30 minutes, drawn from what the company says is the world's most extensive backlink index.
Direct Comparisons: Ahrefs vs SEMrush
Now that you know a little more about each tool, let's take a look at how they compare. I'll analyze each tool to see how they differ in interfaces, keyword research resources, rank tracking, and competitor analysis.
Ahrefs and SEMrush both offer comprehensive information and quick metrics regarding your website's SEO performance. However, Ahrefs takes a bit more of a hands-on approach to getting your account fully set up, whereas SEMrush's simpler dashboard can give you access to the data you need quickly.
In this section, I provide a brief overview of the elements found on each dashboard and highlight the ease with which you can complete tasks.
The Ahrefs dashboard is less cluttered than that of SEMrush, and its primary menu is at the very top of the page, with a search bar designed only for entering URLs.
Additional features of the Ahrefs platform include:
When you log into the SEMrush tool, you will find four main modules. These include information about your domains, organic keyword analysis, ad keywords, and site traffic.
You'll also find some other options like
Both Ahrefs and SEMrush have user-friendly dashboards, but Ahrefs is less cluttered and easier to navigate. On the other hand, SEMrush offers dozens of extra tools, including access to customer support resources.
When deciding on which dashboard to use, consider what you value in the user interface, and test out both.
If you're looking to track your website's search engine ranking, rank tracking features can help. You can also use them to monitor your competitors.
Let's take a look at Ahrefs vs. SEMrush to see which tool does a better job.
The Ahrefs Rank Tracker is simpler to use. Just type in the domain name and keywords you want to analyze, and it spits out a report showing you the search engine results page (SERP) ranking for each keyword you enter.
Rank Tracker looks at the ranking performance of keywords and compares them with the top rankings for those keywords. Ahrefs also offers:
You'll see metrics that help you understand your visibility, traffic, average position, and keyword difficulty.
It gives you an idea of whether a keyword would be profitable to target or not.
SEMRush offers a tool called Position Tracking. This tool is a project tool—you must set it up as a new project. Below are a few of the most popular features of the SEMrush Position Tracking tool:
All subscribers receive regular data updates and mobile search rankings.
The platform provides opportunities to track several SERP features, including Local tracking.
Intuitive reports allow you to track statistics for the pages on your website, as well as the keywords used in those pages.
Identify pages that may be competing with each other using the Cannibalization report.
Ahrefs is a more user-friendly option. It takes seconds to enter a domain name and keywords. From there, you can quickly decide whether to proceed with that keyword or figure out how to rank better for other keywords.
SEMrush allows you to check your mobile rankings and get ranking updates daily, which is something Ahrefs does not offer. SEMrush also offers social media rankings, a tool you won't find within the Ahrefs platform. Both are good; let me know in the comments which one you prefer.
Keyword research is closely related to rank tracking, but it's used for deciding which keywords you plan on using for future content rather than those you use now.
When it comes to SEO, keyword research is the most important thing to consider when comparing the two platforms.
The Ahrefs Keyword Explorer provides you with thousands of keyword ideas and filters search results based on the chosen search engine.
Ahrefs supports several features, including:
SEMrush's Keyword Magic Tool has over 20 billion keywords for Google. You can type in any keyword you want, and a list of suggested keywords will appear.
The Keyword Magic Tool also lets you:
Both of these tools offer keyword research features and allow users to break down complicated tasks into something that can be understood by beginners and advanced users alike.
If you're interested in keyword suggestions, SEMrush appears to have more keyword suggestions than Ahrefs does. It also continues to add new features, like the Keyword Gap tool and SERP Questions recommendations.
Both platforms offer competitor analysis tools, eliminating the need to come up with keywords off the top of your head. Each tool helps you find the keywords that are working for your competitors, so you know they will be valuable to you too.
Ahrefs' domain comparison tool lets you compare up to five websites (your website and four competitors) side by side. It also shows you how your site ranks against the others on metrics such as backlinks, domain rating, and more.
Use the Competing Domains section to see a list of your most direct competitors, and explore how many keyword matches your competitors have.
To find more information about your competitor, you can look at the Site Explorer and Content Explorer tools and type in their URL instead of yours.
SEMrush provides a variety of insights into your competitors' marketing tactics. The platform enables you to research your competitors effectively. It also offers several resources for competitor analysis, including:
Traffic Analytics helps you identify where your audience comes from, how they engage with your site, what devices visitors use to view your site, and how your audiences overlap with other websites.
SEMrush's Organic Research examines your website's major competitors and shows their organic search rankings, keywords they are ranking for, and even if they are ranking for any (SERP) features and more.
The Market Explorer search field allows you to type in a domain and lists websites or articles similar to what you entered. Market Explorer also allows users to perform in-depth data analytics on these companies and markets.
SEMrush wins here because it has more tools dedicated to competitor analysis than Ahrefs. However, Ahrefs offers a lot of functionality in this area, too. It takes a combination of both tools to gain an advantage over your competition.
When it comes to keyword data research, it can be hard to decide which one to choose.
Consider choosing Ahrefs if you:
Consider SEMrush if you:
Both tools are great. Choose the one that meets your requirements, and if you have any experience using either Ahrefs or SEMrush, let me know in the comment section which works well for you.