Green cooling tech is based on the idea of salt melting ice
By combining a salt and a solvent—both cheap and abundant—engineers developed a new refrigeration system that uses less energy and emits no greenhouse gases.
After natural disasters, workers rebuild — and face exploitation
In California, cars and cows hang heavy in the air
Puerto Ricans pump drinking water from hazardous-waste site: report
BY MAX GREENWOOD - 10/14/17 08:54 PM EDT
Some Puerto Rico residents are turning to a hazardous waste site for drinking water as the island continues to reel from Hurricane Maria.
More than three weeks after Hurricane Maria tore across the island, many residents – U.S. citizens – remain without access to clean drinking water. As of Saturday evening, service had been restored to about 64 percent of the island.
But according to a CNN report, some residents are seeking water from potentially risky sources. That includes the Dorado Groundwater Contamination Site, an area designated by the Environmental Protection Agency (EPA) as a so-called Superfund site.
Superfund sites are areas considered so badly contaminated that they are subject to special federal oversight and cleanup efforts. The Dorado site was added to the list in 2016.
On Friday, according to CNN, workers from Autoridad de Acueductos y Alcantarillados (AAA), the Puerto Rican water utility, pumped water from a well at the Dorado site, and distributed it to storm-stricken residents.
According to the EPA, groundwater at the Dorado site is "contaminated with organic based solvents, primarily tetrachloroethylene (PCE) and trichloroethylene (TCE), which are commonly used in commercial and industrial operations such as dry cleaning and metal degreasing."
Exposure to PCE and TCE can cause health problems, including liver damage and an increased risk of cancer, according to the EPA.
Whether the specific well that workers are pumping from contains the chemicals is unknown. CNN reported that the EPA is testing the site over the weekend.
Luis Melendez, sub-director for environmental compliance at AAA, said that the water utility was not aware that it was drawing water from a Superfund site until CNN notified it. But he said that the well had been opened on an emergency basis, and that the water was safe to drink.
CNN also noted that the EPA had found the site to be within federal limits for PCE and chloroform in 2015.
Fast flood insurance ‘fix’ may elude a broke – and broken – program.
Flood insurance, too risky for most insurers to touch, only exists because it is backed by a federal program that has been bleeding, and now is hemorrhaging, cash.
By Joseph A. Davis
Texas Climate News
Hurricanes Harvey, Irma, and Maria delivered massive blows to many U.S. home and business owners – and to the federal government and insurance industry that are expected to pay for hundreds of billions in damage.
But will they? Flood insurance, too risky for most insurers to touch, only exists because it is backed by a federal program that has been bleeding, and now is hemorrhaging, cash. The National Flood Insurance Program (NFIP) runs out Dec. 8, and most signs suggest a politically deadlocked Congress will – again – avoid the issues and punt.
Still, a real debate is likely to take place in coming months, and maybe even a rethinking of the ways the United States handles flood risk. Stay tuned.
Yes, it is a Texas thing. Gov. Greg Abbott estimated that Harvey damage could reach $180 billion (it could be less). And let’s please remember that only about one-fifth of the people suffering damage in Texas actually had flood insurance. The rest will endure loss or pay out of their own pockets, if they are not lucky enough to get FEMA grants. Being on the Gulf, where the waters are warmer, the Texas coast is prone to hurricanes, as Texans know all too well.
Yes, climate change has much to do with it. Climate change was not the simple cause of Harvey and Irma, but scientists say it made them worse and intensified their damage. More rainfall, higher wind strength, bigger storm surges, and rising seas are part of the equation – suggesting future risks will exceed historic risks.
But other factors have amplified flood damage much more. One is barely checked development and growth within areas at risk of flooding. Yet another is buildings that fail to survive or protect against flooding. Another is water-handling infrastructure that fails to protect against flooding, and in some cases makes it worse. Still another is a set of attitudes and beliefs that encourage people to rebuild in flood-damaged areas. Those are all Texas things, too, some think.
The authorization for the NFIP was set to expire Sept. 30. But Congress folded a stop-gap three-month extension into the hurricane aid/debt ceiling/appropriations bill that President Trump signed Sept. 8. Three months may not be enough time. The stopgap fits a long pattern of short-term reauthorizations as Congress has bickered and dithered over many years. Most players agree the program is broken and needs fixing – but they disagree profoundly on the facts and how to fix it.
The House Financial Services Committee had been working on a fix, and approved a bill (H.R.2875) on June 21 by an encouraging 58-0 vote. It hasn’t reached the House floor. The vote came before Harvey, Irma, and Maria made the whole situation worse – not least by wreaking tens or hundreds of billions of dollars in damage likely to put the NFIP much deeper in the hole.
A key fact about the NFIP is that when payouts on claims exceed revenues from premiums, the government borrows to cover the red ink. We have seen a flood of red ink since Katrina in 2005. Before the flags went up for Harvey, the NFIP was in debt to the Treasury to the tune of about $25 billion, with no prospect of paying it back. Now it’s going to be much worse.
The story of flood insurance in the U.S. goes back at least to the Great Mississippi Flood of 1927. After that, most private companies really wouldn’t touch flood insurance (or, at least, offer insurance at premiums people could afford). Congress created the modern NFIP in 1968.
The federal government itself is the insurer. The big catch is that it will not insure properties in communities that don’t have programs to keep development out of the flood plain or risky areas. NFIP is run by FEMA, the Federal Emergency Management Agency. Before they will issue mortgages, most banks require properties in risky areas to have flood insurance. There are about 5 million NFIP policies.
Not to be left out, private insurance companies are the ones who sell and service the federal NFIP insurance. One study says private companies get about one-third of the amount property owners pay in premiums.
One key goal of NFIP reform and reauthorization bills, arguably, would be to put the program back on sound financial footing. Would that mean forgiving the debt (as ranking House Democrat Maxine Waters of California has urged)? It would be like trying to bail out a sinking boat.
Some history to remember: The devastation of Superstorm Sandy in 2012 was in the tens of billions, and it prompted Congress to “reform” the NFIP with a 5-year extension (known as the Biggert-Waters Act) the same year. In an effort to make the program solvent, it raised premiums.
Those paying the premiums howled. Loudly. Not too much later, Congress reformed the program again, passing the Homeowner Flood Insurance Affordability Act (signed in March 2014), which lowered many premium increases – effectively un-reforming the program. The incident illustrates the political dynamics that make it hard to fix the NFIP.
Another issue plaguing the NFIP that lawmakers will find hard to settle: maps. Maps of the flood plain and other flood hazards are key to who gets insurance – and who needs it. That’s why many homeowners and communities would rather not be included in the flood plain. This is even more true for people living in houses already built in the flood plain. To reduce covered risk and keep people out of the flood plain, the feds have an interest in mapping the flood plain fully and accurately.
Climate is also an important factor here: recent events suggest that the so-called “100-year” flood has become far more frequent. There have also been real technical advances in mapping. But politics may win out over science. Especially climate science. Especially in the House of Representatives.
The NFIP’s ostensible goal of keeping people from building houses in the flood plain is subverted another way: disaster aid. Who could be against disaster aid? (Well, that’s actually a complex question, and actually another Texas thing.)
Congress’ first (and political) impulse after a disaster is to run in with money – as it showed with the September debt/aid bill. The history of federal disaster aid (not all conservatives approve) goes back at least to 1927 and Calvin Coolidge. In emergencies, FEMA provides disaster aid – essentially grants – to people whose houses have been ruined.
Most of the Harvey aid in the debt-ceiling bill was disaster aid – not NFIP money. Many homeowners whose stories we hear in the media reinforce a familiar narrative: the doughty homeowner determined to rebuild.
The NFIP insures a significant number of “repetitive loss” properties – something all the reforms have yet to stop. According to one study, some 30,000 properties in the program have flooded multiple times. One Houston-area home with flood insurance has flooded 22 times since 1979, the Wall Street Journal reported in an examination of the issue this month. The Pew Charitable Trusts, a nonprofit think tank, says repetitive loss properties cost the NFIP billions of dollars.
Any legislative fix to the NFIP will be politically and technically difficult. The NFIP and the issues surrounding it are extremely complex. The House committee bill, which is pretty much the only ball on the field right now, is actually a package of some six or more separate bills. The magnitude of the Harvey, Irma, and Maria disasters could give Congress some impetus to act. But a real fix might require many congressional Republicans to overcome their denial of climate-change science.
There really are efforts to find a bipartisan solution. But the House committee bill, which got a unanimous vote, faces a serious block of Republican opposition on the way to the floor. There is also a Senate bill waiting on the bench, which is quite different from the House bill.
And the key issue of whether NFIP premiums get higher or stay where they are will be a difficult test. Ideology will be a big factor, too. As the bills go forward, there will be a lot of talk about letting the marketplace fix the problems. Keep an eye out for efforts to fix the marketplace.
+++++
Joseph A. Davis, a veteran journalist covering environmental and energy issues in the nation’s capital, is the Washington correspondent of Texas Climate News.
ANALYSIS
Thirty Years After Montreal Pact, Solving the Ozone Problem Remains Elusive
Despite a ban on chemicals like chlorofluorocarbons, the ozone hole over Antarctica remains nearly as large as it was when the Montreal Protocol was signed in 1987. Scientists now warn of new threats to the ozone layer, including widespread use of ozone-eating chemicals not covered by the treaty.
BY FRED PEARCE • AUGUST 14, 2017
Did the Montreal Protocol fix the ozone hole? It seemed so. With chlorofluorocarbons (CFCs) and other ozone-eating chemicals banned, many scientists said it was only a matter of time before the ozone layer recharged, and the annual hole over Antarctica healed for good.
But 30 years on, some atmospheric chemists are not so sure. The healing is proving painfully slow. And new discoveries about chemicals not covered by the protocol are raising fears that full recovery could be postponed into the 22nd century – or possibly even prevented altogether.
In mid-September, the United Nations is celebrating the protocol’s 30th anniversary. It will declare that “we are all ozone heroes.” But are we patting ourselves on the back a bit too soon?
The ozone layer is a long-standing natural feature of the stratosphere, the part of the atmosphere that begins about six miles above the earth. The ozone layer filters out dangerous ultraviolet radiation from the sun that can cause skin cancer and damage many life forms. It may have been essential for the development of life on Earth.
So there was alarm in the 1970s when researchers first warned that extremely stable man-made compounds like CFCs, used in refrigerants and aerosols, were floating up into the stratosphere, where they released chlorine and bromine atoms that break down ozone molecules. In the 1980s, Antarctic researchers discovered that these chemical reactions went into overdrive in the super-cold polar stratospheric clouds that formed over the frozen continent. They had begun creating a dramatic “hole” in the ozone layer at the end of each austral winter.
The ensuing panic resulted in the signing of the Montreal Protocol on September 16, 1987. It and its successors have phased out production of a range of man-made chlorine and bromine compounds thought to persist for the several years needed for them to reach the stratosphere. Besides CFCs, they include carbon tetrachloride, hydrochlorofluorocarbons (HCFCs), and methyl bromide, a fumigant once widely used to kill pests.
So far so good. The amount of ozone-depleters in the atmosphere has dropped by more than 10 percent since peaking in the late 1990s. In response, the total ozone in the atmosphere has been largely unchanged since 2000.
Satellite imagery depicting the annual maximum extent of the ozone hole over Antarctica from 1979 to 2013. Credit: NASA Goddard Space Flight Center
But in the past five years, evidence has emerged that potential ozone-eating compounds can reach the ozone layer much faster than previously thought. Under some weather conditions, just a few days may be enough. And that means a wide range of much more short-lived compounds threaten the ozone layer – chemicals not covered by the Montreal Protocol.
These compounds are all around us. They are widely used as industrial solvents for tasks like degreasing and dry cleaning. And their releases into the atmosphere are increasing fast.
These new ozone-busters include dichloromethane (DCM), a common and cheap paint stripper, also used in foam-blowing agents and, ironically, in the manufacture of “ozone-friendly” alternatives to CFCs. With emissions now exceeding one million tons a year, the concentration of DCM in the lower atmosphere has more than doubled since 2004. Even so, it has not been regarded as a threat to the ozone layer, because its typical lifetime in the atmosphere before it is broken down in photochemical reactions is only about five months. It should, atmospheric chemists concluded, remain safely in the lower atmosphere.
But that view collapsed in 2015, when Emma Leedham Elvidge at the University of East Anglia in England examined air samples taken on board commercial aircraft cruising at the lower edge of the stratosphere. She found high levels of DCM, especially over the Indian subcontinent and Southeast Asia, and particularly during the Asian monsoon season, when strong updrafts fast-track air from the ground to the stratosphere. It seems they were taking DCM along for the ride.
How much should we worry? Ryan Hossaini, an atmospheric chemist at Lancaster University, recently did the math. He calculated that DCM currently contributes less than 10 percent of the chlorine in the ozone layer. But on current emission trends its share will keep growing, and that could delay the ozone hole’s recovery by 30 years, until at least 2095, he suggested.
Others share that concern. “Growing quantities of DCM are leaking into the stratosphere, where it is exceptionally effective in destroying the ozone,” says David Rowley, an atmospheric chemist at University College London, who was not involved in the research. “The potential for DCM to affect the global ozone budget is profound.”
Alarm bells are ringing about dozens of other short-lived, potentially ozone-destroying chlorine compounds accumulating in the atmosphere as a result of fast-rising global manufacturing. They include 1,2-dichloroethane, a chemical widely used in the manufacture of PVC pipes. There are few atmospheric measurements of this compound yet, “but sporadic data suggest it is a significant source of chlorine in the atmosphere,” says Hossaini.
The risks of such chemicals reaching the ozone layer are greatest in the tropics, where manufacturing is booming in fast-industrialising countries such as China and India, and where, as luck would have it, atmospheric circulation patterns are favorable. The Asian monsoon can propel the gases to the stratosphere in as little as ten days, according to unpublished research seen by Yale Environment 360.
The movement of ozone-depleting chemicals through the atmosphere, shifting from the tropics and concentrating in Antarctica. Credit: NASA Goddard Space Flight Center
Thirty years on, the Montreal Protocol has not begun to come to grips with these chemicals, warns Rowley. “The naïve view until recently,” he says, “was that short-lived [chemicals] didn’t present a threat to stratospheric ozone. Wrong.”
Other loopholes in the protocol are concerning researchers as well. In 2014, colleagues of Leedham Elvidge’s at the University of East Anglia warned that three CFCs supposedly banned under the protocol were turning up in increasing amounts in the clean air blowing round the Southern Ocean and captured at Cape Grim in Tasmania. Johannes Laube, an atmospheric chemist at the University of East Anglia, calculated that global emissions of CFC-113a, once an important feedstock in manufacturing both refrigerants and pyrethroid pesticides, doubled in two years.
How come? It turns out that the Montreal Protocol never completely banned CFCs. “CFC-113a is covered by a loophole that allows industries to apply for exemptions,” Laube says. Confidentiality clauses in the treaty about these exemptions mean that “we simply don’t know if we have found exempted emissions, or if they are from some illegal manufacture somewhere. Either way, they are increasing fast, which makes this worrying.” Trade in banned ozone-depleting chemicals has declined in the past decade, but remains a problem, and has been documented particularly for hydrochlorofluorocarbons.
Scientists knew recovery of the ozone layer would take time because of the long lifetimes of many of the dangerous compounds we unleashed in past decades. But last year, Susan Solomon of MIT – who back in the 1980s became one of the world’s most celebrated scientists for uncovering the chemistry of the polar stratospheric clouds — declared that she had detected the first “fingerprints” of the hole closing. “The onset of healing of Antarctic ozone loss has now emerged,” she wrote.
But other researchers remain cautious. There have been some recent bumper springtime holes in Antarctic ozone. The 2015 hole was the fourth largest since 1991, peaking at an area larger than the continent of North America. It was also deeper than other recent holes and lasted longer. The 2016 hole was also worse than average, and the 2017 hole is expected to be severe, too.
Solomon blamed 2015 on the Calbuco volcano in Chile, which ejected sulphur particles that enhanced the ozone-destroying properties of polar stratospheric clouds. But Susan Strahan of NASA’s Goddard Space Flight Center warns that the size of the hole in any given year is still dominated by year-to-year variations in the temperature of the stratosphere and the vagaries of meteorology. “The signature of ozone recovery is not quite there yet,” she says, adding that day will come, but we may have to wait until the 2030s.
Meanwhile, at the other end of the planet, ozone losses over the Arctic may still be worsening. The Arctic is less susceptible to the formation of ozone holes than Antarctica because the weather there is messier. The stable air that creates the ultra-cold conditions in which polar stratospheric clouds form over Antarctica is much less common in the Arctic. But ozone loss does occur whenever temperatures drop low enough for the clouds to form.
A deep hole briefly formed over the Arctic in 2011. In places, more than 80 percent of the ozone was destroyed, twice the loss in the worst previous years, 1996 and 2005. In both the past two winters, researchers saw polar stratospheric clouds over parts of Britain, says Jonathan Shanklin of the British Antarctic Survey. But they were brief and did not lead to major ozone loss.
Shanklin says an important reason for the sluggish recovery of the ozone layer is global warming. As increased levels of greenhouse gases such as carbon dioxide trap more of the heat radiating from the Earth’s surface, less warmth reaches the stratosphere, which cools as a result. This trend has been evident for almost 40 years. A colder stratosphere improves conditions for ozone loss. Climate change “could delay the recovery of the ozone hole well into the second half of this century,” he says.
Should we be frightened? Some of the crazier hype in the early days of the ozone hole – like blind sheep in Patagonia and collapsing marine ecosystems – proved nonsense. But the raised risk of skin cancers from the extra ultraviolet radiation streaming through the thinned ozone layer is real enough – particularly for reckless white-skinned sunbathers. The ozone layer is still as thin as it was 30 years ago.
The good news is that without the Montreal Protocol things would have been a great deal worse, says Martyn Chipperfield, an atmospheric chemist at the University of Leeds. The Antarctic hole would be 40 percent bigger than it is; the ozone layer over Europe and North America would be 10 percent thinner; the 2011 Arctic hole would have been Antarctic-sized; and we would be looking at about two million more cases of skin cancers by 2030, according to research conducted by Chipperfield and colleagues.
Even so, the idea that the Montreal Protocol is doing its job and the recovery is under way begins to look complacent. If emissions of uncontrolled ozone-depleting chemicals such as DCM continue rising, then the gains could be lost. The answer is obvious. “We should be looking into controlling DCM and other solvents, much in the same way as we did CFCs,” says Leedham Elvidge.
The World Meteorological Organization and other UN agencies overseeing the protocol acknowledge that DCM and other short-lived ozone depleting substances “are an emerging issue for stratospheric ozone,” but the government signatories have yet to take action to limit their emissions.
That would involve getting rid of a far wider range of chemicals than so far done under the protocol. Protecting the ozone layer “presents a much greater industrial and political challenge than previously thought,” says Rowley. Thirty years on, there is evidently still a lot to do.
Fred Pearce is a freelance author and journalist based in the U.K. He is a contributing writer for Yale Environment 360 and is the author of numerous books, including "The Land Grabbers," "Earth Then and Now: Potent Visual Evidence of Our Changing World," and "The Climate Files: The Battle for the Truth About Global Warming."
Could papayas help Hawaii become energy independent?
A plant pathologist and her team of researchers have been gathering leftover papayas from local packing houses and turning them into a fuel that’s used to produce biodiesel.
BY SARA NOVAK | SEP 12 2017
Papayas are big business in Hawaii. In 2016, the islands produced nearly 20 million pounds of the tropical melon valued at an estimated $10 million. The Hawaiian papaya is also highly controversial. After the papaya ringspot virus decimated the island’s crop three decades ago, much of the fruit grown there today has been genetically modified to be resistant.
For Hawaiian farmers, selling the papayas can be difficult. Countries are often reluctant to import genetically modified crops. The farmers also face an uphill battle because of the high cost of imported fertilizers. But even more problematic is waste. Approximately a third of the Hawaiian papaya crop is discarded because it’s bruised or misshapen. Farmers throw out these otherwise saleable crops when margins are already thin.
Lisa Keith, a plant pathologist with the USDA in Hilo, Hawaii, may have a solution. Keith and her team of researchers have been gathering leftover papayas from local packing houses and turning them into a fuel that’s used to produce biodiesel. It’s a surprisingly simple process: algae are added to large tanks containing a sterile pureed papaya solution, where a process called heterotrophic growth takes place—in the absence of sunlight, the algae feed on the sugar in the papaya. When the algae become starved of nitrogen after depleting the nutrients in the puree, lipid production kicks in, and the cells balloon with oil in just under two weeks. These oils can be used for biodiesel production after the glycerol present in the cells is extracted.
The entire process takes less than a month. It’s an entirely waste-free endeavor because, says Keith, the leftover algae can be added to fish meal and the glycerol can be added to feedstock. This makes the algae/papaya combination a practical solution to another major island issue: almost 90 percent of Hawaii’s energy is imported. Producing local, sustainable energy through solar, wind, and other methods like biodiesel makes economic sense for the islands, which have set a 2045 deadline for producing 100 percent of their energy from sustainable sources. What’s more, biodiesel production could be another reliable stream of income for farmers struggling to survive in a difficult market.
The main obstacle that Keith and her team have encountered in their attempt to scale the energy source has been oil extraction. “Removing the oil from the algae cells is still a challenge,” says Keith, and they’ve yet to come up with a means for extracting oils from algae cells at an economically sustainable rate. But, she says, “This is an ongoing problem with the industry as a whole.”
Currently, there are two main methods for oil extraction: mechanical and chemical. Often they’re combined, depending on the type of algae used in production. The mechanical approach involves physically crushing the cells, much as olives or soybeans are crushed to extract their oil. The chemical approach uses solvents to break open the cells.
In the lab, Keith and her team are using a combined method of mechanical disruption and the use of solvents. Algae are added to test tubes and the samples are mixed at high speeds to break open the cells. The oil is then extracted by adding hexane, a chemical solvent often used in food processing. The hexane evaporates, leaving behind the algae oil.
Even though the project isn’t yet fully scalable, Keith says that the research is promising. She’s currently working with the Hawaii Department of Agriculture’s Agribusiness Development Corporation and Big Island Biodiesel to optimize the complicated process so that it can become a viable energy source on the island. Production has increased much faster than she thought it would.
In the future, her team plans to research other Hawaiian crops that could be used in biodiesel production, such as Okinawan sweet potato, guava, banana, and cacao pulp.
While algae biodiesel production isn’t going to make Hawaii energy independent in the short term, it’s a big step in the right direction—a win for farmers looking for another viable stream of income, and a win for those sick of the high cost of unsustainable energy on this isolated archipelago.