Vijay Gadepally, a senior staff member at MIT Lincoln Laboratory, directs a number of projects at the Lincoln Laboratory Supercomputing Center (LLSC) to make computing platforms, and the artificial intelligence systems that run on them, more efficient. Here, Gadepally discusses the growing use of generative AI in everyday tools, its hidden environmental impact, and some of the ways Lincoln Laboratory and the broader AI community can reduce emissions for a greener future.
Q: What trends are you seeing in terms of how generative AI is being used in computing?
A: Generative AI uses machine learning (ML) to create new content, such as images and text, based on data that is fed into the ML system. At the LLSC we design and build some of the largest academic computing platforms in the world, and in recent years we've seen an explosion in the number of projects that need access to high-performance computing for generative AI. We are also seeing how generative AI is changing all kinds of fields and domains; for example, ChatGPT is already influencing the classroom and the workplace faster than regulations can keep up.
We can imagine all kinds of uses for generative AI in the next decade, such as powering highly capable virtual assistants, developing new drugs and materials, and even improving our understanding of basic science. We can't predict everything generative AI will be used for, but I can certainly say that, with algorithms becoming more and more complex, its impact on computing, energy, and climate will continue to grow very quickly.
Q: What strategies is the LLSC using to mitigate this climate impact?
A: We are always looking for ways to do more efficient computing (https://www.ll.mit.edu/news/ai-models-are-devouring-energy-tools-reduce-consumption-are-here-if-data-centers-will-adopt). Doing so helps our data center make the most of its resources and allows our scientific colleagues to advance their fields as efficiently as possible.
As an example, we've been reducing the amount of power our hardware consumes through simple changes, similar to dimming or turning off lights when you leave a room. In one experiment, we reduced the power consumption of a group of graphics processing units by 20 to 30 percent, with minimal impact on their performance, by applying a power cap. This technique also lowered hardware operating temperatures, making the GPUs easier to cool and longer lasting.
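As a rough illustration of what a power cap looks like in practice, the sketch below uses the pynvml bindings to NVIDIA's management library to lower a single GPU's power limit to about 70 percent of its default. The 70 percent figure and single-GPU setup are illustrative assumptions, not the LLSC's actual configuration, and changing the limit typically requires administrator privileges.

```python
# Minimal sketch of GPU power capping via pynvml (pip install nvidia-ml-py).
# The 70 percent cap is an illustrative assumption, not the LLSC's setting;
# setting the limit usually requires administrator privileges.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU on this node

default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)  # milliwatts
current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(handle)
print(f"default limit: {default_mw / 1000:.0f} W, current limit: {current_mw / 1000:.0f} W")

# Cap the card at roughly 70 percent of its default power limit.
pynvml.nvmlDeviceSetPowerManagementLimit(handle, int(default_mw * 0.7))

pynvml.nvmlShutdown()
```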
Another strategy is changing our behavior to be more climate-aware. At home, some of us might choose to use renewable energy sources or smart scheduling. We are using similar techniques at the LLSC, such as training AI models when temperatures are cooler or when local grid electricity demand is low.
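A toy version of that kind of scheduling is sketched below: the script polls a grid carbon-intensity feed and holds a training job until intensity drops below a threshold. The endpoint URL, the JSON field name, and the 300 gCO2/kWh threshold are all hypothetical placeholders, not a real LLSC service.

```python
# Toy sketch of carbon-aware job scheduling. The endpoint, JSON field,
# and threshold below are hypothetical placeholders, not a real service.
import time
import requests

CARBON_FEED = "https://example.org/grid/carbon-intensity"  # hypothetical telemetry feed
LOW_CARBON_THRESHOLD = 300                                 # gCO2/kWh, illustrative value

def wait_for_low_carbon_window(poll_minutes: int = 15) -> float:
    """Block until the grid's reported carbon intensity falls below the threshold."""
    while True:
        intensity = requests.get(CARBON_FEED, timeout=10).json()["carbon_g_per_kwh"]
        if intensity < LOW_CARBON_THRESHOLD:
            return intensity
        time.sleep(poll_minutes * 60)

if __name__ == "__main__":
    intensity = wait_for_low_carbon_window()
    print(f"Grid at {intensity:.0f} gCO2/kWh -- submitting the training job now")
    # e.g., hand the job off to the cluster scheduler here
```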
We also realized that a lot of the energy spent on computing is often wasted, like a water leak that increases your bill without any benefit to your home. We developed some new techniques that allow us to monitor computing workloads as they run and then terminate those that are unlikely to produce good results. Surprisingly, in a number of cases we found that most of these computations could be terminated early without compromising the final result.
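One simple way to picture this is an early-stopping check on a training run: if the validation loss has not improved meaningfully for several epochs, the job is terminated rather than left to burn energy. This is a generic sketch of the idea with made-up numbers, not the monitoring system described above.

```python
def should_terminate(val_losses, patience=5, min_improvement=1e-3):
    """Flag a run whose validation loss has stopped improving.

    val_losses: per-epoch validation losses, most recent last.
    """
    if len(val_losses) <= patience:
        return False
    best_before = min(val_losses[:-patience])
    best_recent = min(val_losses[-patience:])
    # Terminate if the last `patience` epochs brought no meaningful improvement.
    return best_recent > best_before - min_improvement

# Made-up history: the last five epochs are flat, so the run would be stopped early.
history = [0.92, 0.71, 0.55, 0.48, 0.47, 0.471, 0.472, 0.470, 0.473, 0.471]
print(should_terminate(history))  # True
```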
Q: What is an example of a project you have done that reduces the energy consumption of a generative AI program?
A: We recently built a climate-aware computer vision tool. Computer vision is a domain that focuses on applying AI to images; for example, differentiating between dogs and cats in an image, correctly labeling objects within an image, or looking for components of interest within an image.
In our tool, we included real-time carbon telemetry, which produces information about how much carbon our local grid is emitting while a model is running. Based on this information, our system automatically switches to a more energy-efficient version of the model, which typically has fewer parameters, during times of high carbon intensity, or to a much higher-fidelity version of the model during times of low carbon intensity.
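A bare-bones sketch of that switching logic might look like the following, where a service picks the smaller model variant whenever grid carbon intensity is high. The telemetry endpoint, the 400 gCO2/kWh threshold, and the model names are illustrative assumptions rather than the tool's actual implementation.

```python
# Bare-bones sketch of carbon-aware model selection. The telemetry endpoint,
# threshold, and model names are illustrative assumptions, not the actual tool.
import requests

CARBON_FEED = "https://example.org/grid/carbon-intensity"  # hypothetical telemetry feed
HIGH_CARBON_THRESHOLD = 400                                # gCO2/kWh, illustrative value

def pick_model(low_power_model, high_fidelity_model):
    """Serve the smaller, cheaper model while the grid is carbon-intensive."""
    intensity = requests.get(CARBON_FEED, timeout=10).json()["carbon_g_per_kwh"]
    return low_power_model if intensity >= HIGH_CARBON_THRESHOLD else high_fidelity_model

# Usage (hypothetical model objects):
# model = pick_model(small_classifier, large_classifier)
# prediction = model(image)
```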
By doing this, we saw an almost 80 percent reduction in carbon emissions over a one- to two-day period. We recently extended this idea to other generative AI tasks, such as text summarization, and found the same results. Interestingly, performance sometimes improved after using our technique!
Q: What can we do as consumers of generative AI to help mitigate its climate impact?
A: As consumers, we can ask our AI providers to offer greater transparency. For example, on Google Flights I can see a variety of options that indicate a specific flight's carbon footprint. We should get similar kinds of measurements from generative AI tools so that we can make a conscious decision about which product or platform to use based on our priorities.
We can also make an effort to be better educated about generative AI emissions in general. Many of us are familiar with vehicle emissions, and it can help to talk about generative AI emissions in comparative terms. People may be surprised to learn, for example, that one image-generation task is roughly equivalent to driving four miles in a gasoline-powered car, or that it takes the same amount of energy to charge an electric car as it does to generate about 1,500 text summaries.
There are many cases where customers would be happy to make a trade-off if they knew its impact.
Q: What do you see for the future?
A: Mitigating the climate impact of generative AI is one of those problems that people all over the world are working on, with a similar goal. We're doing a lot of work here at Lincoln Laboratory, but it's only scratching the surface. In the long term, data centers, AI developers, and energy grids will need to work together to perform “energy audits” that uncover other unique ways we can improve computing efficiency. We need more partnerships and more collaboration to move forward.
If you are interested in learning more or collaborating with Lincoln Laboratory on these efforts, please contact Vijay Gadepally.