Your Smart Thermostat Data Could Tackle One of Earth’s Biggest Emissions Sources

Planes? Nope. Agriculture? Try again. It’s actually our buildings that are among the biggest culprits of energy consumption and emissions. The numbers are truly staggering. The latest figures from the International Energy Agency reveal that the operations of buildings alone account for 30 percent of global final energy consumption. The direct CO2 emissions from buildings hit three gigatonnes in 2022. To put that in perspective, the aviation industry contributed almost 800 megatonnes in the same year—73 percent less than building emissions. 
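
To make the comparison transparent, here is the back-of-the-envelope arithmetic behind that percentage, using the figures quoted above (a quick sanity check, not new data):

```python
# Back-of-the-envelope check of the comparison above (figures as quoted in the text).
buildings_mt = 3_000   # direct building CO2 emissions, 2022: three gigatonnes = 3,000 megatonnes
aviation_mt = 800      # aviation CO2 emissions, 2022: roughly 800 megatonnes

reduction = (buildings_mt - aviation_mt) / buildings_mt
print(f"Aviation emitted {reduction:.0%} less CO2 than buildings in 2022")  # ~73%
```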

Think your thermostat is smart? ecobee is getting smarter

Over 90 percent of the energy used in residential buildings is devoted to space heating and hot water. Making existing buildings more energy-efficient is the obvious solution. Indeed, to achieve a climate-neutral Europe by 2050, the EU has pledged to shift from "nearly-zero energy buildings" to "zero-emission buildings" for new construction starting in 2028.

On a consumer level, demand for greener and more efficient homes is also gaining traction. Smart thermostats—Wi-Fi-enabled devices that learn your household routines and optimise heating and cooling accordingly—have been hitting the shelves at breakneck speed in recent years. They are big business, too; the market is projected to reach US$5.15bn in 2025.

One such manufacturer is ecobee. Aside from launching the world’s first smart thermostat in 2009, ecobee has also made a significant impact on reducing energy use and e-waste. They ensure their products last and “design for easy disassembly and repair and resell thousands of thermostats per year.” According to their own data, customers across North America have saved over 41.2 terawatt-hours (TWh) of energy with ecobee’s smart thermostats—the equivalent of taking all the homes in New York City off the grid for an entire year.

The company is now changing tack. Leveraging the popularity and proliferation of smart thermostats across North America, they’re now developing an entirely new way to tackle our buildings’ planetary burden: data donation.

Energy and money savings vary depending on the settings used, and average figures are difficult to pin down. However, the average American home is said to save approximately 8 percent on its annual heating and cooling bills with a smart thermostat, while UK customers save between 8.4 and 16.5 percent of their heating energy use. Some estimates cite savings as high as 31 percent.

Powering science with user data

ecobee’s data donation project allows users to help researchers analyse energy consumption patterns on an unprecedented scale. Currently, those looking to understand and optimise home energy usage only have access to data from a handful of homes. ecobee hopes the program can boost this number to over 200,000.

The data, which is stripped of any personally identifiable information, includes details such as home size, temperature settings, occupancy schedules and Heating, Ventilation and Air Conditioning (HVAC) runtimes. Researchers at universities, government agencies and NGOs then use this information to develop better energy-saving strategies. The hope is that these researchers will publish their findings publicly, so the data continues to be useful. The company receives no financial compensation for the data. ecobee declined to comment for this article.
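
To make this concrete, here is a minimal, hypothetical sketch of what one anonymised donated record might look like. The field names and units are our own illustration based on the categories mentioned above, not ecobee's actual schema:

```python
# Hypothetical sketch of an anonymised donated record, based on the data
# categories mentioned above. Field names and units are illustrative only.
from dataclasses import dataclass

@dataclass
class DonatedThermostatRecord:
    home_size_m2: float          # home size (no address or coordinates)
    setpoint_heat_c: float       # heating temperature setting
    setpoint_cool_c: float       # cooling temperature setting
    occupied_hours: list[int]    # hours of day the home is typically occupied
    hvac_runtime_min: int        # daily HVAC runtime in minutes

record = DonatedThermostatRecord(
    home_size_m2=120.0,
    setpoint_heat_c=20.5,
    setpoint_cool_c=24.0,
    occupied_hours=[0, 1, 2, 3, 4, 5, 6, 7, 18, 19, 20, 21, 22, 23],
    hvac_runtime_min=210,
)
print(record)
```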

“You can’t manage what you can’t measure”

While smart devices and their data optimise energy use, they’re far from a silver bullet in the fight against climate change. Structural problems such as poorly insulated buildings, outdated heating systems and urban planning that prioritises carbon-intensive materials remain unaddressed, and these represent the lion’s share of our buildings’ emissions. Even if every person on the planet donated their data, there is only so “efficient” our homes can be. Smart tech can certainly help trim energy waste, but deep emissions cuts will require larger-scale policy shifts and infrastructure investments, including a sizeable reduction in our reliance on non-renewable energy sources.

However, by contributing real-world insights at scale, devices like ecobee’s could well shift the conversation from isolated efficiency gains to systemic change. That change couldn’t come soon enough. With an ever-increasing number of people living in urban environments and the risk of missing our climate targets growing ever likelier, we need all the data we can get to make our homes as sustainable as possible. With every adjusted degree and every shared data point, we’re not just heating or cooling our homes more intelligently—we’re rewriting the blueprint for how buildings consume resources.

Starting From Scratch: 'Basic Data' Buoy Bolsters Marine Protection

The importance of fundamental research

It is now widely known that the oceans are severely threatened by numerous human activities. Environmental disasters, dumping and overfishing are just some examples. The result: species threatened with extinction, destroyed habitats and a growing threat to the oceans as a source of human food and livelihoods.

Poorly managed nature conservation work is often ineffective and, in the worst case, causes additional damage. The best way to ensure effective environmental protection is to thoroughly collect 'basic data' — the fundamental observational data needed to understand a topic — in advance, evaluate it professionally and develop successful long-term conservation projects based on the findings. Afterwards, the results can be compared with the earlier data, meaning the success of the conservation project can be statistically evaluated and, if necessary, adjustments can be made.

In addition, many key observations about nature — including data on cycles, interactions and interrelationships, as well as complex problems and the measures taken against them — are often only reflected accurately in data after a longer observation period. Basic research is therefore usually designed for the long term.

Rügen's data buoy is deployed

The Federal Agency for Nature Conservation has developed a floating measuring station for marine research, equipped with various sensors, standing about four metres high and weighing 900 kilograms. This so-called data buoy has the task of recording hydrological and meteorological data such as temperature, turbidity and oxygen and salinity levels, as well as documenting the occurrence of harbour porpoises and bats. Data on shipping traffic and noise pollution within the protected area are also collected. According to Corinna Bertz from the Federal Agency for Nature Conservation, further sensors are also being developed. In the future, for example, it should be possible to measure CO2 and methane concentrations and "contribute to assessing the state of the oceans as a result of climate change and the possible contribution of the oceans to the storage of greenhouse gases such as CO2 and methane".

Part of the information is transmitted from the data buoy in real time and without interruption via satellite and GSM connection to the Agency. "The results — as long as they are not subject to secrecy — are available to all research institutions and the public", explains Corinna Bertz. The first findings are expected to be available in autumn 2023. More data buoys will follow in other marine protected areas.
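
As a rough illustration, a single telemetry bundle from such a buoy might look something like the following. The variables follow those named in the article; the structure, units and values are assumptions for illustration only:

```python
# Illustrative sketch of one telemetry bundle such a buoy might send via its
# satellite/GSM uplink. Variables follow the article; structure and units are assumed.
import json
from datetime import datetime, timezone

reading = {
    "timestamp": datetime.now(timezone.utc).isoformat(timespec="seconds"),
    "water_temp_c": 11.4,        # temperature
    "turbidity_ntu": 3.2,        # turbidity
    "oxygen_mg_l": 8.9,          # oxygen level
    "salinity_psu": 7.8,         # salinity (the Baltic is brackish)
    "porpoise_detections": 2,    # harbour porpoise detections
    "noise_db": 96.0,            # underwater noise level
}

payload = json.dumps(reading)    # compact payload for a low-bandwidth uplink
print(payload)
```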

Strengthening marine protected areas through basic data

The recorded data provides essential information about the state of the environment. In the future, biological and oceanographic monitoring will serve as a scientific basis for sustainable area management. To this end, the collected data will be incorporated into nature conservation and political decisions, playing a decisive role in determining the type of conservation measures. This is to ensure that the protected areas are safeguarded in the long term.

"Marine protected areas are important instruments for the protection of the oceans. It is important to strengthen management in the existing protected areas. The new measuring stations will provide the necessary environmental data for this"

says the Federal Government's Commissioner for the Marine Environment, Sebastian Unger, about the project.

In addition, the open data approach of the project enables an interdisciplinary exchange with other research institutions. The public is also involved in the project, which should lead to more acceptance and increasing interest in marine conservation.

Because the data buoy documents detailed information on an individual area, tailor-made recommendations for action can be derived from it for problems occurring on-site. This increases the conservation project's chances of success. Meaningful, successful long-term marine protection needs sound basic research built on plenty of data — observe first, then take targeted action!

New Breakthroughs in Nuclear Fusion Announced - But How Clean is Fusion Power?

Nuclear fusion may seem the stuff of sci-fi, but according to recent announcements it may soon become a real possibility. An experimental fusion reactor in South Korea was able to sustain fusion operation for 20 seconds - a notable achievement considering the infancy of the field.

In fact, the achievement at the Korea Superconducting Tokamak Advanced Research (KSTAR) device dates back to 2020, but only now has the claim been fully peer-reviewed and published in Nature. The tokamak - a doughnut-shaped device which confines a thermonuclear fusion reaction - was able to reach ion temperatures of 100 million degrees Celsius, around seven times hotter than the core of our sun. Prior to this breakthrough, KSTAR had only maintained 100 million degrees for 10 seconds.

The KSTAR uses powerful magnetic fields to create and stabilise an ultra-hot plasma, the heat of which can then be used to generate electricity. Unlike conventional nuclear fission, which splits atomic nuclei via a chain reaction, nuclear fusion merges nuclei together, releasing huge amounts of energy. In theory, if the hardware is up to the task, nuclear fusion could replicate the processes of our sun and provide near-limitless power with only basic raw materials and zero emissions. Most fusion reactors use hydrogen isotopes, often derived from seawater, as the basis of the process.

https://www.youtube.com/watch?v=DI2pTdyUbLo

The major issue for researchers with regard to fusion power is twofold. Firstly, current experiments consume more power than they produce, and key to overcoming this is maintaining high temperatures for longer. Secondly, there is the breakdown of mechanical parts: at such huge temperatures, equipment can quickly be destroyed. The key to KSTAR’s achievement was the use of an Internal Transport Barrier (ITB) to help control the confinement and stability of the reaction.

The KSTAR is not the only reactor pushing the boundaries of energy science. Similar tokamak devices, such as the UK’s MAST, the ITER project in France and China’s EAST, are also trying to unlock the secrets of fusion. In fact, it’s possible KSTAR’s record has already been smashed: the Korean team themselves may have broken their own record last year, while China claims its reactor achieved temperatures of 120 million degrees Celsius for 101 seconds in 2021. Other projects, such as Germany’s Wendelstein 7-X device, use a stellarator design, in which computer-designed magnetic coils keep the superheated plasma in place. Although stellarators produce more stable reactions than tokamaks, they are harder to heat.

In any case, such breakthroughs are a significant step towards achieving fusion energy. Some researchers suggest a fully functioning fusion reactor could be viable by the 2040s.

Fusion vs Fission

For some, talk of ‘nuclear’ fusion may set alarm bells ringing. With the Chernobyl and Fukushima disasters still in recent memory, some states - such as Germany - have sought to move away from nuclear energy. Although nuclear fission can reliably produce large amounts of energy, it comes with significant meltdown risks if mismanaged, produces long-term radioactive waste and can serve as a stepping stone to nuclear weapons.

However, it’s important to note that nuclear fusion is not the same as fission. Both are nuclear processes (i.e. the manipulation of the nuclei of atoms), but beyond this there are few similarities. In fact, the International Atomic Energy Agency suggests fusion could be one of the most environmentally friendly sources of energy.

Whereas nuclear fission involves splitting a heavier element into lighter ones, fusion joins two lighter elements into a heavier one. In both cases, energy is released because of differences in nuclear binding energy. In fusion, this difference or “missing mass” is converted into energy via Einstein’s equation E = mc². Since c (the speed of light) is very large, even a tiny amount of missing mass translates into a huge amount of energy.

While nuclear fission uses radioactive fuels such as uranium-235 - a relatively rare isotope that must be enriched from more common uranium - fusion mainly uses isotopes of hydrogen, namely deuterium and tritium. These hydrogen isotopes are superheated in reactors to a high enough temperature to overcome the so-called Coulomb barrier and fuse together. For enough energy to be produced, this reaction must be maintained for long enough to prevent cooling.
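
The energy released by this deuterium-tritium reaction can be recovered directly from the mass defect via E = mc². A small worked example, using standard published atomic masses:

```python
# Worked example: energy released by one deuterium-tritium fusion event,
# recovered from the mass defect via E = mc^2 (standard published atomic masses).
U_TO_MEV = 931.494           # 1 atomic mass unit expressed as MeV/c^2

m_deuterium = 2.014102       # atomic mass units (u)
m_tritium   = 3.016049
m_helium4   = 4.002602
m_neutron   = 1.008665

mass_defect = (m_deuterium + m_tritium) - (m_helium4 + m_neutron)
energy_mev = mass_defect * U_TO_MEV
print(f"Mass defect: {mass_defect:.6f} u -> {energy_mev:.1f} MeV per fusion")  # ~17.6 MeV
```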

Fusion has several distinct advantages over fission. Its base fuel - hydrogen - is the most abundant element in the universe and non-radioactive. It requires no mining, can easily be extracted from water and is not toxic. The process of fusion also creates no greenhouse gases, the main byproducts being helium - a valuable gas in short global supply - and tritium, a mildly radioactive hydrogen isotope. Unlike the long-lived radioactive waste from fission power, tritium has a short half-life (12.3 years) and can be fed back into the reactor as fuel.

Fusion is also considered inherently safe, with no risk of a meltdown. In fission energy, the issue is not producing large amounts of heat but containing it. Since nuclear fission relies on a chain reaction, vast amounts of coolant are required to keep the temperature stable. If this coolant becomes exhausted or is otherwise unavailable, a fission reactor can quickly exceed its safety parameters - resulting in a meltdown.

The opposite is true of fusion: the challenge is maintaining the temperature in the first place. If a fusion reactor were damaged or otherwise impaired, the temperature would drop and the reaction would simply shut down.

Finally, fusion reactors cannot be used to develop weapons. Although hydrogen bombs use the same process of nuclear fusion, they require a fission bomb to generate the necessary heat. Furthermore, since the fusion fuel is constantly injected and consumed, it is never present in sufficient quantity to produce the instantaneous power needed for a nuclear explosion.

This may all sound too good to be true, and of course there are downsides to nuclear fusion. Firstly, although deuterium is readily available in water, the main source of tritium is currently nuclear fission reactors. So a small number of fission reactors - with all their inherent risks - would be needed to supply tritium for fusion reactors, at least until a large enough supply is developed. In some cases lithium, a metal which must be mined, is also used in reactors to breed tritium.

Another issue is the irradiation of the components themselves. The intense neutron fluxes of the reactor could damage its walls, creating longer-term radioactive waste and potentially allowing radiation to escape. Blanketing materials are being developed to minimise this risk, but it remains a continual concern.

But perhaps the biggest hurdle is the science itself. There is still much to be done to fully understand the process of nuclear fusion. In fact, even the full function of the above-mentioned ITBs is not yet completely understood by researchers. These scientific hurdles, combined with the huge economic cost, have given rise to the scientific maxim that “fusion energy is always thirty years away”.

But of course, the climate crisis is happening now. Although fusion may appear to be the perfect silver bullet for near-limitless energy, we cannot guarantee it will come to our rescue any time soon. While pursuing fusion remains an admirable and worthwhile endeavour, we must remember there are other methods available now that can be explored and expanded.

Can Video Game Hardware Help Climate Research? Nvidia to Create a 'Digital Twin' of Earth

To most people, the name Nvidia likely does not mean much, but ask a video gamer and you’ll probably get a much more enthusiastic response. Since 1993, Nvidia has been at the forefront of the video game hardware market and has developed a range of high-end graphics cards for gaming PCs and consoles.

Generally speaking, Nvidia has more in common with Call of Duty than climate research, but the US-based corporation is now looking to apply its expertise in simulation and processing power to helping the environment.

Nvidia recently announced Earth-2, a new supercomputer which they claim can realistically model the Earth to a metre scale. The project is still very much in the development stage, but already it is being suggested such computer simulations could greatly assist climate research and environmental policy.

Computer simulations of the Earth are not entirely new; see, for example, the Earth Simulator series developed in Japan from 1999. But currently, most climate simulations have a resolution of around 10 to 100 kilometres. Although this is useful for wide-ranging weather patterns and systems, it does not necessarily address the full spectrum of our environmental woes.

By zooming into a metre-scale resolution, Earth-2 could theoretically more accurately simulate elements such as the global water cycle and the changes which lead to intensifying storms and droughts. With metre-scale resolutions, towns, cities and nations will also be able to receive early warnings about climate events, and plan infrastructure improvements or emergency response on a street-by-street, or even building-by-building basis. Eventually, such simulations could plug into decades-old climate data and provide a broad ‘digital twin’ of the Earth for researchers to experiment with, such as examining how clouds reflect back sunlight.

Reaching this new resolution would require machines with millions to billions of times more computing power than is currently available. And with processing power generally increasing tenfold every five years, waiting for conventional progress alone would take decades. However, Nvidia claims this giant leap is possible if several different technologies and disciplines are combined. By bringing together GPU-accelerated computing (Nvidia’s core competence) with deep-learning artificial intelligence and physics-informed neural networks, vast amounts of information can be run through giant room-sized supercomputers.

The plan is for Earth-2 to then be plugged into Nvidia’s Omniverse platform (not to be confused with the ‘metaverse’) which is an open platform developed for virtual collaboration and real-time physically accurate simulations. Although largely geared towards Nvidia’s main market - video games and media - such powerful simulations can also play important roles in scientific research, architecture, engineering and manufacturing.

Nvidia already has some experience in this field. Earlier this year, they unveiled Cambridge-1, a 100 million USD supercomputer constructed in the UK. Built in only 20 weeks - as opposed to the two years usually needed for supercomputers - Cambridge-1 aims to provide a collaborative simulation platform for healthcare research, universities and hospitals.

https://youtu.be/lnnIX3OQAzQ

Rise of the Supercomputers

At their essence, supercomputers are computers with an extremely high level of performance compared to a general desktop computer. Whereas most consumer computers are measured in millions of instructions per second (MIPS), supercomputers are graded by floating-point operations per second (FLOPS). To put it simply, supercomputers are hundreds of thousands of times faster and more powerful than a generic computer, with the most powerful machines able to perform up to 27,000 trillion calculations per second.

This is the kind of power you need to model increasingly complex simulations, such as the physics of space, quantum mechanics or fission research. As the technology develops, these simulations can be run at increasingly larger resolutions and scales. In the 1960s, climate modelling was only possible for the oceans and land vegetation. Now, models have been developed for ice sheet movement, atmospheric chemistry, marine ecosystems, biochemical cycles and aerosol dispersal, among others.

For example, the Plymouth Marine Laboratory makes use of the Massive GPU Cluster for Earth Observation (MAGEO) supercomputer, which allows it to predict wildfires, detect oil spills and plastics, and map mangroves and biodiversity. Today, there are around 500 supercomputers of varying power across the globe.

But supercomputers are not just ‘super’ in terms of processing power; they’re also super-sized. Usually, supercomputers consist of large banks of processors and memory modules networked together with high-speed connections. The largest supercomputers also utilise parallel processing, which allows a computer to tackle multiple parts of a problem at once. This means supercomputers are usually the size of entire rooms.

But all this processing power also comes with massive energy demands and huge quantities of waste heat. This means supercomputers themselves are not necessarily the most environmentally friendly research tools. For example, Japan’s Fugaku supercomputer draws more than 28 megawatts of power - the equivalent of the electricity use of tens of thousands of homes in the United States. Australian astronomers calculated that supercomputer use was their single biggest carbon producer, three times that of flying. Of course, the amount of carbon generated depends on how power is generated within a nation in the first place, with supercomputers in greener nations generally producing less carbon.
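
The "tens of thousands of homes" comparison roughly checks out. Assuming an average US household draws about 10,500 kWh per year (a commonly cited figure, and our assumption here):

```python
# Rough check of the comparison above. The average-household figure is an
# assumption (~10,500 kWh/year, a commonly cited US average).
fugaku_mw = 28
avg_home_kw = 10_500 / (365 * 24)       # ~1.2 kW continuous draw per home

homes_equivalent = fugaku_mw * 1_000 / avg_home_kw
print(f"~{homes_equivalent:,.0f} US homes")   # on the order of 23,000 homes
```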

However, steps are being taken to improve their energy efficiency. More efficient code means less processing power is needed, cutting energy demands, while building supercomputers in chillier locations, such as Iceland, means their excess heat can be used for other purposes. Furthermore, few would suggest this energy use is unjustified, as the work supercomputers do can help develop environmental protections.

It should also be pointed out that, compared to general computer use, supercomputer energy use is minuscule. As Dan Stanzione, executive director of the Texas Advanced Computing Center at the University of Texas at Austin, explains:

“Supercomputers are also just a tiny fraction of all computing. Large data centres in the United States alone would emit about 60 million tonnes, so all the supercomputers in the world would be less than 5 per cent of the power we use for email and streaming videos of cats.”

Zooniverse: A Million Volunteers are Helping to Spot Animals, Transcribe Records and Weather Watch in the Name of Science

Scientific projects come in all shapes and sizes, with varying degrees of support, both financial and practical. With the aid of modern tools, even small teams can gather vast amounts of data - perhaps even too much. Sifting through reams of information, such as photographs, satellite imagery or historical records, could take a small research team years, if not decades, to complete on their own. Luckily, a collaborative citizen science website is helping research teams connect with thousands of willing volunteers who help carry the burden - for free.

Zooniverse is an online collaborative platform that provides people-powered research to research teams of all sizes. The approach is fairly straightforward: over a million users have volunteered their time to work through diverse datasets related to a wide range of research topics, from climate protection to history and literature.

Volunteers require no previous experience; they work through data following a simple set of instructions, transcribing information or answering basic questions. The entire project functions via the ‘wisdom of the crowd’ concept, which suggests accuracy increases as more people look at a particular piece of data. Once enough contributions have been made, the Zooniverse platform can also estimate how likely errors are to occur, further helping to refine the process. In some cases, the results are used to train artificial intelligence systems. Zooniverse also provides additional communication tools for the researchers and collaborators, including forums, to ensure they can respond to queries or other questions.
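
A minimal sketch of the 'wisdom of the crowd' idea, in which independent volunteer labels are combined by majority vote and the agreement rate doubles as a rough confidence estimate. This illustrates the general technique, not Zooniverse's actual aggregation pipeline:

```python
# Minimal sketch of crowd aggregation: majority vote over independent volunteer
# labels, with the agreement rate as a rough confidence estimate. This is the
# general technique, not Zooniverse's actual pipeline.
from collections import Counter

def aggregate(labels: list[str]) -> tuple[str, float]:
    counts = Counter(labels)
    label, votes = counts.most_common(1)[0]
    return label, votes / len(labels)

volunteer_labels = ["penguin", "penguin", "rock", "penguin", "penguin"]
label, agreement = aggregate(volunteer_labels)
print(f"consensus: {label} (agreement {agreement:.0%})")  # penguin (80%)
```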

As a result of Zooniverse collaborations, hundreds of research papers have been submitted, especially in the areas of space and biodiversity. Although volunteers can get involved in a wide range of topics - from transcribing historical criminal records to listening for mysterious bursts of radiation from space - below are some of the climate science-related projects to get involved in:

NASA Globe Cloud Gaze

Although satellite imagery can provide a great overview of weather systems, there is always room for more accuracy and better context. Within the NASA-partnered CLOUD GAZE project, volunteers examine photographs of the sky and provide feedback on the amount and type of cloud cover in each photograph. This is then used for comparisons with other data sources, like satellites, surface weather reports or even weather and climate computer models.

Old Weather - WWII

Assessing today’s climate change largely depends on having reliable information on weather systems of the past. Unfortunately for researchers, systematic weather records only reach back to around the 19th century. Often, those most concerned with recording the weather were sailors, whether on whaling, military or civilian vessels. The Old Weather project is attempting to decode their historical weather records to better understand the condition of the seas as far back as 1849. Currently, the project is trawling through US naval records from the Second World War, having previously covered 19th-century whalers as well as Arctic expeditions.

Penguin Watch

Monitoring the condition of penguins is not always easy, largely because they like to gather and nest on remote islands and in inhospitable regions. The Penguin Watch project is asking volunteers to work through thousands of photos from automated photo traps set up in 100 locations around the Southern Ocean and Antarctic Peninsula. Altogether these produce 8,000 photos a year, many of them teeming with penguins. The team needs volunteers to simply count the penguins they see or, alternatively, to spot eggs, chicks or empty nests.

Artificial Intelligence vs Brain Power

Much has been made in recent years of the powerful potential of artificial intelligence to assist in research and analysis of all kinds, from spotting animals to developing more efficient rural electrification projects.

When it comes to analysing big data, AI is likely unsurpassed; however, human brain power also comes with some advantages. Whereas an AI can spot patterns and make inferences, it is still very much limited to its original training and cannot think laterally as we humans do. When unexpected variables arise, artificial intelligence often cannot process them, while humans can contextualise and understand them. Furthermore, establishing a major AI platform is not cheap. Although there are an increasing number of free open-source AIs, bespoke AIs can cost anywhere from 6,000 to 300,000 USD, likely placing them outside the budgets of smaller research teams.

Furthermore, projects like Zooniverse not only provide an alternative source of big data analysis, but also create an important bond between researchers and laypeople, allowing them to be involved and take a degree of ownership of scientific and cultural developments.

This article is part of our Special Feature "Civic Tech - Ways Out of the Climate Crisis with Digital Civic Engagement". You can find all articles of the Special Feature here: Special Feature Civic Tech

The Special Feature is part of the project funding of the German Federal Environmental Foundation (Deutsche Bundesstiftung Umwelt - DBU), in the framework of which we are producing four special features over two years on the topic of "Opportunities and potentials of digitalisation for sustainable development".

Interview: How Can We Make AI More Environmentally Friendly?

What potential does artificial intelligence have to help us protect the environment and tackle climate change? And with all the computing power it requires, how can we make artificial intelligence itself more environmentally friendly? What can companies, developers and governments do to ensure AI helps us protect - and not destroy - the environment? We put our questions to Stephan Richter from the Institute for Innovation and Technology.

Artificial intelligence is one of the biggest buzzwords of the day. Experts agree that the burgeoning technology will significantly change the world as we know it. In fact, it already is: we use AI applications in our everyday lives today, whether in the form of handy translation programmes or image recognition.

Many believe the technology could have huge potential when it comes to protecting the environment and tackling climate change, and a huge number of projects are already putting "AI for good" to work. AI applications are already being used to shrink the carbon footprint of buildings, reduce food waste and protect the rainforest. But at the same time, machine learning itself consumes enormous amounts of energy. Artificial intelligence - just like most other technologies - cannot simply be labelled "good" or "bad" per se. AI is a tool, not a solution. It's up to us to decide what its impact will be. And that's why it's crucial we address how best to design it to ensure it has a positive impact on our planet.

We talked about the pitfalls and possible potentials of AI with Stephan Richter, one of the authors of the short study "Artificial Intelligence in the Environmental Sector" (link in German). In the study, carried out in 2019 by the Institute for Innovation and Technology (iit) on behalf of the German Environment Agency (Umweltbundesamt), the team of authors takes a look at AI use cases and considers future questions of sustainability.

Stephan, in the study for the Federal Environment Agency, you looked at the potential of AI for environmental sustainability. Which areas did you identify as having the most potential?

In our study we focused on different development paths where AI has the potential for positive impact, mainly on the basis of interviews with experts. Of course, this by no means provides a complete picture - and we're not able to demonstrate where the greatest potential lies. But there are other studies, such as the excellent "Tackling Climate Change with Machine Learning", which was published by David Rolnick and colleagues at the end of 2019. Focusing on different sectors, it looks at what is happening in the field of machine learning in a sustainability context. There are two strands of thought: how can AI be used to mitigate climate change, and how can AI be used to adapt to its consequences? Interestingly, the authors of that study also aren't able to make any scientifically based statements like: these are the top potentials, or these are the "low-hanging fruits". Unfortunately, there is no one solution that stops climate change, and no one adaptation strategy that offers us the ultimate answer. But there are many small, partial solutions that, used intelligently, could contribute to sustainability.

Which sectors are seeing a lot of innovation in AI and machine learning?

In the energy sector, for example - a fairly large sector that naturally releases a lot of greenhouse gases - the focus is on predicting energy consumption. So: how is energy consumed in a city or across many small households? When is the energy network used to capacity, and by how much? AI can be used for these kinds of very complex systems. When it comes to the mobility sector, it's about predicting traffic: how can you positively influence traffic in a city if you know in advance how it will develop in the next few minutes or hours? But in the mobility sector it is also about identifying patterns. For example, as an energy provider, I need to know where and when electric cars will be charged - because then I might suddenly need a whole lot of energy. Machine learning can help there too.

But when I use AI to save resources, there might be unexpected rebound effects that aren't necessarily positive...

True. An example comes to mind from the aviation industry, where AI can be used to improve wind forecasting and ultimately optimise flight routes and save fuel.

It sounds positive at first, but there is a catch?

If I save fuel, then of course I also save money. And then we have the problem of the so-called Jevons paradox. If we increase efficiency, i.e. use less fuel, then the airline can make more profit. In other words, it could offer flights more cheaply in the competitive environment in which it operates. That could lead to more demand for flights. And that in turn could mean the kerosene savings we originally achieved by using AI to calculate optimised flight routes are lost to the increased demand. In a worst-case scenario, if the increased demand were to exceed the savings potential, it could even have negative effects on the climate. That's not an AI-specific problem, of course, but an economic and system-specific one.
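
A stylised calculation with invented numbers makes the mechanism concrete: a 10 percent efficiency gain is wiped out as soon as cheaper tickets push demand up by more than about 11 percent.

```python
# Stylised Jevons-paradox example with invented numbers: a 10% efficiency gain
# can be more than cancelled out by demand growth.
fuel_per_flight = 0.90     # after a 10% efficiency gain (was 1.00)
demand_growth = 1.15       # hypothetical: 15% more flights due to cheaper fares

total_fuel = fuel_per_flight * demand_growth
print(f"Total fuel use vs. before: {total_fuel:.3f}x")  # 1.035x -> a net increase
```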

© Anna Tiessen. Stephan Richter, research associate at the iit, contributed to the UBA study "Künstliche Intelligenz im Umweltbereich" ("Artificial Intelligence in the Environmental Sector").

Getting back to your study: you mainly looked at the sustainability potential of artificial intelligence. But training AI is an incredibly energy- and data-intensive process. How can artificial intelligence itself be made sustainable?

That's a tricky question. You have to look at what kind of sustainability you're talking about. Is it economic sustainability? Then demands quickly arise for easy data accessibility, or for new and different rules on data ownership so that everyone can access everything - which is very questionable if you put it back into the context of social sustainability. When it comes to environmental sustainability, however, I think we should put the issue of AI in a broader context, namely that of digitalisation and capitalism. We cannot make AI sustainable if we do not make digital capitalism sustainable.

And how can digitalisation, or rather, digital capitalism be made sustainable?

There are already approaches to solving this problem. For example, there is a legislative framework like Germany's carbon pricing system, which was passed in 2019 and puts an environmental tax on carbon dioxide emissions. One idea would be to further expand and tighten up approaches like that. In the future, companies would have to fully offset their product- and service-related CO2 emissions, by taking compensatory measures, for example. A binding framework at the European - or even better, international - level would be a big step towards sustainable digitalisation and also towards sustainable AI. In very simplified terms, we can keep in mind that the "problems" we have to address in machine learning in the field of environmental sustainability are mainly energy problems. And energy is not usually produced in a 100 percent sustainable way. You still have a large amount of coal and nuclear power - and that is definitely not sustainable or desirable.

This high energy consumption is mostly due to the training of AI applications. How can that training be made more energy efficient?

There are more publications on how to improve life cycle assessments using artificial intelligence than there are on life cycle assessments of AI itself. We need a lot more research into life cycle assessments of AI. One often-cited work, lead-authored by Strubell at the University of Massachusetts in 2019, showed that training an AI for speech recognition emits about as much CO2 as five cars over their entire life cycle, i.e. from manufacture to operation and disposal. That sounds like a lot of energy at first, and it is. But the question is: what do you do with the result? How are the AI tools applied, and how large is the impact?

Do you have an example?

The "Stena Fuel Pilot" project, for example, uses AI to optimise shipping routes. In this case, AI was used to calculate how to cover the route between Kiel and Gothenburg with as little fuel as possible, and it was shown that two to three percent of fuel costs could be saved. Teaching the AI was certainly expensive and energy-intensive; if the tool were used in just one ferry, it would hardly be worth it. However, if the pilot project were scaled up and the AI tool transferred to the entire fleet, each ferry would save a few percent of fuel on every crossing. And if the technology were then operated over a long period, the impact becomes even greater. So although the training of the AI was energy-intensive, the highly scaled use of the AI tool in this example ultimately pays off and has a positive climate impact.
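
The scaling logic can be sketched with entirely hypothetical numbers; none of the figures below come from the Stena project, they merely illustrate why fleet-wide rollout changes the payback:

```python
# Entirely hypothetical numbers to illustrate the scaling argument; none of
# these figures come from the Stena project itself.
training_cost_kwh = 5_000_000                    # assumed one-off training cost
saving_per_crossing_kwh = 0.025 * 40_000         # 2.5% of an assumed 40,000 kWh crossing
crossings_per_year = 700                         # assumed schedule per ferry

for ferries in (1, 38):                          # single ferry vs an assumed fleet
    annual_saving = saving_per_crossing_kwh * crossings_per_year * ferries
    print(f"{ferries:2d} ferr{'y' if ferries == 1 else 'ies'}: "
          f"payback in {training_cost_kwh / annual_saving:.2f} years")
```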

What possibilities are there for a "greener", more sustainable design of AI - for example, for companies or developers?

Training AI or machine learning tools uses a lot of energy - in terms of the hardware, the cloud providers and computing times. The location of the data centre also plays an important role: Germany, Switzerland and the USA all have a different energy mix. In these contexts, as a programmer, I can have at least some influence. Do I work with the most efficient hardware, and with algorithms that are as efficient as possible? When I look at a cloud provider, I can check whether it uses green energy, whether it is carbon neutral and what its sustainability commitments look like. And I can also look at the location of the data centre. There's a nice paper about that, which resulted in the Machine Learning Emissions Calculator. You can enter parameters such as hardware, cloud provider, runtime and location to get a rough estimate of how much CO2 is actually emitted by your application. It's certainly not a conclusive solution, but it's a small, interesting project developed by the AI community to help raise awareness of the topic.
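
The arithmetic behind such a calculator is straightforward: energy drawn by the hardware, scaled by data-centre overhead and the carbon intensity of the local grid. A minimal sketch with illustrative placeholder constants (not the calculator's actual data):

```python
# Minimal sketch of the estimate such a calculator performs: hardware energy,
# scaled by data-centre overhead (PUE) and the local grid's carbon intensity.
# All constants are illustrative placeholders, not the calculator's actual data.
def training_emissions_kg(gpu_power_w: float, hours: float,
                          pue: float, grid_kg_per_kwh: float) -> float:
    energy_kwh = gpu_power_w / 1000 * hours * pue
    return energy_kwh * grid_kg_per_kwh

# e.g. one 300 W GPU for two weeks in a data centre with PUE 1.6,
# on a grid emitting 0.4 kg CO2 per kWh (all assumed values):
print(f"{training_emissions_kg(300, 24 * 14, 1.6, 0.4):.1f} kg CO2")  # ~64.5 kg
```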

Are there ways to make the training of AI applications or machine learning less energy-intensive in themselves?

Well, at the end of the day, machine learning is about an algorithm that can detect certain patterns by digging through data. If you don't tell the algorithm where to look or which parameters to optimise, then it can end up looking through a huge number of different patterns and parameters in a data set. For example, if you ride home from here on a bicycle, you pay attention to two or three parameters in order to get home as quickly, as comfortably or as safely as possible. As a human being, you can do this quite well. If you have ten parameters, it becomes more complex. And if you have 100 parameters that you want to include, all of which can interact with each other, then that is barely possible for a human being. For this, you could use a machine learning algorithm that analyses, based on data, which parameters could be interrelated and what impact they have. However, if the algorithm does not know what to look for, it scours the entire data set extremely intensively. But if we tell it where to search, i.e. restrict the search field, then it has to dig through far less of that mass of data. That means you can ultimately save energy by knowing which parameters you want to optimise, and how.
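
In hyperparameter terms, "telling the algorithm where to look" shrinks the search space multiplicatively, and with it the compute. A toy illustration:

```python
# Toy illustration: restricting which parameters are searched, and over what
# ranges, shrinks the search space (and hence the compute) multiplicatively.
from math import prod

full_grid = {"lr": 20, "layers": 10, "batch": 8, "dropout": 10}  # options per parameter
restricted = {"lr": 5, "layers": 3}   # domain knowledge fixes the rest

print(prod(full_grid.values()))       # 16,000 configurations to evaluate
print(prod(restricted.values()))      # 15 configurations
```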

And what about data efficiency? Training machine learning algorithms always requires huge amounts of data – or can this be reduced?

You always need data sets to train AI. At the end of the day, all you're doing is building a statistical model that detects, or even predicts, patterns in a data set. However, it is important to select and pre-sort the data sets beforehand. If you know exactly where you want to go, you can limit the search field so the AI requires less computing power. This is important for two reasons: it reduces both the workload and the energy use.

And what can governments do to make AI applications less energy-hungry?

The right approach is to invest in research and development, for example to develop more efficient AI algorithms and hardware. And then we have to look at digitalisation as a whole. While research and development in the area of efficiency improvement helps, we also need regulation when it comes to CO2. If every gram of CO2 had a price tag or had to be compensated in some way, then science, business and individual citizens - who would end up carrying those costs - would work to ensure that digitalisation is as sustainable as possible. That also means we should primarily apply AI in use cases where we predict it will have an overall positive environmental or social impact.

This is a translation of an original article that first appeared on RESET's German-language site.

This article is part of the RESET Special Feature "Artificial Intelligence - Can Computing Power Save Our Planet?" Explore the rest of our articles in the series right here.

The BCFN YES! Research Grant Competition Deadline Is Fast Approaching

The Barilla Center for Food and Nutrition's YES! competition is for PhD and postdoc researchers who are looking to improve the sustainability of our food system. A 20,000 EUR research grant for one year is up for grabs.

The Barilla Center for Food and Nutrition (BCFN) Foundation seeks to support learning and research that can help improve our food system by “reducing hunger, fighting food waste, and promoting healthy lifestyles and sustainable agriculture”.

Their BCFN YES! (Young Earth Solutions) Research Grant Competition is particularly designed to engage young people and young thinking in solving such challenges. The competition seeks to reward promising research projects which concretely promote a more environmentally, socially and economically sustainable agri-food system - one which also prioritises public health.

Research projects may focus on 'Sustainable and healthy diets', 'Urban food systems and policies', 'Resilient agriculture, land use change and agroecology', 'The nexus between climate change, energy and food', or 'Youth and women's involvement in agriculture', to mention a few of the research areas of particular interest to the foundation. Check out their website for the full list of different research fields.

A jury panel of six experts operating within the fields of nutrition, agriculture, ecology, rural development and bio-science will evaluate projects on the basis of their innovative thinking and potential for global impact.

Last year's BCFN YES! grant went to the University of the West Indies, Mona Campus, Jamaica, for their climate adaptation research and knowledge transfer project aimed at reducing the drought vulnerability of small-farmers in Jamaica.

Who can apply and when is the deadline?

Young researchers up to the age of 35 (by the end of 2017) who have either completed or are working on their PhD - with an enrollment date starting from 1st January 2011.

Individual researchers as well as multidisciplinary and cross-national research teams of up to three people are encouraged to apply.

A maximum of three research projects may each scoop up this year's generous 20,000 EUR research grant. Applications should be submitted online by 28th June 2017.

And here's a video with a few highlights from last year's competition.

" ["post_title"]=> string(69) "The BCFN YES! Research Grant Competition Deadline Is Fast Approaching" ["post_excerpt"]=> string(227) "

The Barilla Center for Food and Nutrition's YES! competition is for PhD and post doc researchers who are looking to improve the sustainability of our food system. A 20,000 EUR research grant for one year is up for grabs.

" ["post_status"]=> string(7) "publish" ["comment_status"]=> string(6) "closed" ["ping_status"]=> string(6) "closed" ["post_password"]=> string(0) "" ["post_name"]=> string(70) "bcfn-yes-research-grant-competition-deadline-fast-approaching-05092017" ["to_ping"]=> string(0) "" ["pinged"]=> string(0) "" ["post_modified"]=> string(19) "2021-09-10 15:30:39" ["post_modified_gmt"]=> string(19) "2021-09-10 15:30:39" ["post_content_filtered"]=> string(0) "" ["post_parent"]=> int(0) ["guid"]=> string(93) "http://reset.org/blog/bcfn-yes-research-grant-competition-deadline-fast-approaching-05092017/" ["menu_order"]=> int(1887) ["post_type"]=> string(4) "post" ["post_mime_type"]=> string(0) "" ["comment_count"]=> string(1) "0" ["filter"]=> string(3) "raw" } } ["post_count"]=> int(7) ["current_post"]=> int(-1) ["before_loop"]=> bool(true) ["in_the_loop"]=> bool(false) ["post"]=> object(WP_Post)#6712 (24) { ["ID"]=> int(115927) ["post_author"]=> string(5) "21506" ["post_date"]=> string(19) "2025-03-19 07:00:00" ["post_date_gmt"]=> string(19) "2025-03-19 05:00:00" ["post_content"]=> string(9770) "

Planes? Nope. Agriculture? Try again. It’s actually our buildings that are among the biggest culprits of energy consumption and emissions. The numbers are truly staggering. The latest figures from the International Energy Agency reveal that the operations of buildings alone account for 30 percent of global final energy consumption. The direct CO2 emissions from buildings hit three gigatonnes in 2022. To put that in perspective, the aviation industry contributed almost 800 megatonnes in the same year—73 percent less than building emissions. 

Think your thermostat is smart? ecobee is getting smarter

Over 90 percent of the energy used in residential buildings is devoted to space heating and hot water. Making existing buildings more energy-efficient is the obvious solution. Indeed, to achieve a climate-neutral Europe by 2050, the EU has pledged to shift from "nearly-zero energy buildings" to "zero-emission buildings" for new construction starting in 2028.

On a consumer level, demand for greener and more efficient homes is also gaining traction. Devices such as smart thermostats—Wi-Fi-enabled devices that learn your household routines and optimise heating and cooling accordingly—have been hitting the shelves at breakneck speed in recent years. Smart thermostats are big business; the market is projected to reach US$5.15bn in 2025.
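For readers curious about what it means for a thermostat to "learn your household routines", here is a deliberately minimal sketch of the underlying idea, assuming a simple occupancy-frequency model. Everything below (the constants, thresholds and function names) is an illustrative assumption, not code from any real product.

```python
# Minimal sketch of the core idea behind a "learning" thermostat:
# infer typical occupancy from past observations, then lower the
# heating setpoint when the home is usually empty. Purely
# illustrative; real devices use far richer models and sensors.

COMFORT_C = 21.0   # setpoint while occupied (assumed value)
SETBACK_C = 17.0   # energy-saving setpoint while away (assumed value)

def learn_occupied_hours(observations, min_ratio=0.5):
    """observations: list of (hour, was_occupied) samples over many days.
    Returns the set of hours occupied at least `min_ratio` of the time."""
    counts, totals = {}, {}
    for hour, occupied in observations:
        totals[hour] = totals.get(hour, 0) + 1
        counts[hour] = counts.get(hour, 0) + (1 if occupied else 0)
    return {h for h in totals if counts[h] / totals[h] >= min_ratio}

def target_setpoint(hour, occupied_hours):
    # Hold comfort temperature at usually-occupied hours, save otherwise
    return COMFORT_C if hour in occupied_hours else SETBACK_C

# Example: home usually empty mid-morning, usually occupied in the evening
history = [(9, False), (9, False), (19, True), (19, True), (19, False)]
usual = learn_occupied_hours(history)
print(target_setpoint(9, usual), target_setpoint(19, usual))  # 17.0 21.0
```

The design choice worth noting is that the savings come entirely from the setback during predicted-empty hours; the better the occupancy prediction, the less comfort is traded for energy.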

One such manufacturer is ecobee. Aside from launching the world’s first smart thermostat in 2009, ecobee has also made a significant impact on reducing energy and e-waste. They ensure their products last and “design for easy disassembly and repair and resell thousands of thermostats per year.” According to their own data, customers across North America have saved over 41.2 terawatt-hours (TWh) of energy with ecobee’s smart thermostats—the equivalent of taking all the homes in New York City off the grid for an entire year. 

The company is now changing tack. Leveraging the popularity and proliferation of smart thermostats across North America, they’re now developing an entirely new way to tackle our buildings’ planetary burden: data donation.

Energy and money savings vary depending on the settings used, and average savings are difficult to pin down. However, the average American home is said to save approximately 8 percent on its annual heating and cooling bills with a smart thermostat, while UK customers save between 8.4 and 16.5 percent of their heating energy use. Some estimates cite savings as high as 31 percent.

Powering science with user data

ecobee’s data donation project allows users to help researchers analyse energy consumption patterns on an unprecedented scale. Currently, those looking to understand and optimise home energy usage only have access to data from a handful of homes. ecobee hopes they can boost this number to over 200,000 with the program.

The data, which is stripped of any personally identifiable information, includes insights such as home size, temperature settings, occupancy schedules and Heating, Ventilation and Air Conditioning (HVAC) runtimes. Researchers at universities, government agencies and NGOs then use this information to develop better energy-saving strategies. The hope is that these researchers will go on to publish their findings publicly, so that the donated data continues to deliver value. ecobee receives no financial compensation for this data, and declined to comment for this article.
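As a thought experiment, the sketch below shows how such a de-identified record might be structured and aggregated once it reaches researchers. The field names and the toy analysis are assumptions made for illustration; they do not describe ecobee's actual donated-data schema or tooling.

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical, simplified donated record: illustrative field names
# only, not ecobee's actual schema. Note there is no address, account
# ID or anything else that could identify a household.
@dataclass
class ThermostatRecord:
    home_size_sqft: int       # coarse home size
    setpoint_heat_c: float    # heating temperature setting
    occupied_hours: float     # hours occupied per day (schedule-derived)
    hvac_runtime_min: float   # total HVAC runtime per day, in minutes

def average_runtime_by_occupancy(records, threshold_hours=8.0):
    """Toy analysis: compare mean daily HVAC runtime between homes
    occupied more vs. less than `threshold_hours` per day."""
    busy = [r.hvac_runtime_min for r in records if r.occupied_hours >= threshold_hours]
    quiet = [r.hvac_runtime_min for r in records if r.occupied_hours < threshold_hours]
    return (mean(busy) if busy else None, mean(quiet) if quiet else None)

# With 200,000+ homes, even a simple aggregate like this becomes
# statistically meaningful; three records here just show the shape.
records = [
    ThermostatRecord(1200, 20.5, 10.0, 420.0),
    ThermostatRecord(2400, 22.0, 4.0, 310.0),
    ThermostatRecord(1800, 21.0, 9.0, 440.0),
]
print(average_runtime_by_occupancy(records))  # (430.0, 310.0)
```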

“You can’t manage what you can’t measure”

While smart devices and their data optimise energy use, they’re far from a silver bullet in the fight against climate change. Structural inefficiencies remain unaddressed: poorly insulated buildings, outdated heating systems and urban planning that prioritises carbon-intensive materials. These issues account for the lion’s share of our buildings’ emissions. Even if every household on the planet donated its data, there’s a limit to how “efficient” our homes can be. Smart tech can certainly help trim energy waste, but deep emissions cuts will require larger-scale policy shifts and infrastructure investments, including a sizeable reduction in our reliance on non-renewable energy sources.

However, by contributing real-world insights at scale, devices like ecobee’s could well shift the conversation from isolated efficiency gains to systemic change. And this change couldn’t come soon enough. With an ever-increasing number of people living in urban environments and the risk of missing our climate targets growing ever more real, we need all the data we can get to make our homes as sustainable as possible. With every adjusted degree and every shared data point, we’re not just warming or cooling our homes more intelligently—we’re rewriting the blueprint for how buildings consume resources.

" ["post_title"]=> string(82) "Your Smart Thermostat Data Could Tackle One of Earth’s Biggest Emissions Sources" ["post_excerpt"]=> string(145) "Data privacy is on everyone's lips. But, could your smart thermostat data help to reduce emissions from the building sector's enormous footprint?" ["post_status"]=> string(7) "publish" ["comment_status"]=> string(6) "closed" ["ping_status"]=> string(6) "closed" ["post_password"]=> string(0) "" ["post_name"]=> string(79) "your-smart-thermostat-data-could-tackle-one-of-earths-biggest-emissions-sources" ["to_ping"]=> string(0) "" ["pinged"]=> string(0) "" ["post_modified"]=> string(19) "2025-05-21 12:23:31" ["post_modified_gmt"]=> string(19) "2025-05-21 10:23:31" ["post_content_filtered"]=> string(0) "" ["post_parent"]=> int(0) ["guid"]=> string(27) "https://reset.org/?p=115927" ["menu_order"]=> int(0) ["post_type"]=> string(4) "post" ["post_mime_type"]=> string(0) "" ["comment_count"]=> string(1) "0" ["filter"]=> string(3) "raw" } ["comment_count"]=> int(0) ["current_comment"]=> int(-1) ["found_posts"]=> int(7) ["max_num_pages"]=> int(1) ["max_num_comment_pages"]=> int(0) ["is_single"]=> bool(false) ["is_preview"]=> bool(false) ["is_page"]=> bool(false) ["is_archive"]=> bool(true) ["is_date"]=> bool(false) ["is_year"]=> bool(false) ["is_month"]=> bool(false) ["is_day"]=> bool(false) ["is_time"]=> bool(false) ["is_author"]=> bool(false) ["is_category"]=> bool(false) ["is_tag"]=> bool(false) ["is_tax"]=> bool(true) ["is_search"]=> bool(false) ["is_feed"]=> bool(false) ["is_comment_feed"]=> bool(false) ["is_trackback"]=> bool(false) ["is_home"]=> bool(false) ["is_privacy_policy"]=> bool(false) ["is_404"]=> bool(false) ["is_embed"]=> bool(false) ["is_paged"]=> bool(false) ["is_admin"]=> bool(false) ["is_attachment"]=> bool(false) ["is_singular"]=> bool(false) ["is_robots"]=> bool(false) ["is_favicon"]=> bool(false) ["is_posts_page"]=> bool(false) ["is_post_type_archive"]=> bool(false) ["query_vars_hash":"WP_Query":private]=> string(32) "6d49c6ec81a058ef6fefecdc69a99240" ["query_vars_changed":"WP_Query":private]=> bool(true) ["thumbnails_cached"]=> bool(false) ["allow_query_attachment_by_filename":protected]=> bool(false) ["stopwords":"WP_Query":private]=> NULL ["compat_fields":"WP_Query":private]=> array(2) { [0]=> string(15) "query_vars_hash" [1]=> string(18) "query_vars_changed" } ["compat_methods":"WP_Query":private]=> array(2) { [0]=> string(16) "init_query_flags" [1]=> string(15) "parse_tax_query" } ["query_cache_key":"WP_Query":private]=> string(84) "wp_query:5888473913ae12e68c7f0b5401781288:0.76686000 17481354540.80754700 1748135421" } object(WP_Term)#6801 (11) { ["term_id"]=> int(14444) ["name"]=> string(8) "research" ["slug"]=> string(8) "research" ["term_group"]=> int(0) ["term_taxonomy_id"]=> int(14444) ["taxonomy"]=> string(11) "custom_tags" ["description"]=> string(0) "" ["parent"]=> int(0) ["count"]=> int(7) ["filter"]=> string(3) "raw" ["term_order"]=> string(1) "0" }

Content related to: research

Your Smart Thermostat Data Could Tackle One of Earth’s Biggest Emissions Sources

Data privacy is on everyone's lips. But could your smart thermostat data help to reduce emissions from the building sector's enormous footprint?

Starting From Scratch: ‘Basic Data’ Buoy Bolsters Marine Protection

Successful marine protection needs a strong scientific basis of sound data. A floating measuring station in the Baltic Sea aims to provide this.

New Breakthroughs in Nuclear Fusion Announced – But How Clean is Fusion Power?

Fusion power is still some decades away, although new strides are being made every year. But how does fusion compare with nuclear fission power?

Can Video Game Hardware Help Climate Research? Nvidia to Create a ‘Digital Twin’ of Earth

The graphics card manufacturer wants to use its know-how to recreate the Earth in a digital simulation at an unprecedented one-metre scale.

Zooniverse: A Million Volunteers are Helping to Spot Animals, Transcribe Records and Weather Watch in the Name of Science

Whether it's counting penguins, deciphering historical records or listening to the stars, Zooniverse harnesses people power to assist in breaking down the big data behind scientific research.

Interview: How Can We Make AI More Environmentally Friendly?

What potential does artificial intelligence have to help us protect the environment and tackle climate change? And with all the computing power it requires, how can we make artificial intelligence itself more environmentally friendly? What can companies, developers and governments do to ensure AI helps us protect - and not destroy - the environment? We put our questions to Stephan Richter from the Institute for Innovation and Technology.

The BCFN YES! Research Grant Competition Deadline Is Fast Approaching

The Barilla Center for Food and Nutrition's YES! competition is for PhD and postdoc researchers who are looking to improve the sustainability of our food system. A 20,000 EUR research grant for one year is up for grabs.