In this tutorial, we will build an interactive web scraping project in Google Colab! This guide will walk you through extracting live weather forecast data from the United States National Weather Service. You will learn how to set up your environment, write a Python script using BeautifulSoup and requests, and integrate an interactive UI with ipywidgets. This tutorial provides a step-by-step approach to collecting, displaying, and saving weather data, all within a single, self-contained Colab notebook.
!pip install beautifulsoup4 ipywidgets pandas
First, we install three essential libraries: beautifulsoup4 for parsing HTML content, ipywidgets for creating interactive elements, and pandas for data manipulation and analysis. Running this in your Colab notebook ensures your environment is fully prepared for the web scraping project.
import requests
from bs4 import BeautifulSoup
import csv
from google.colab import files
import ipywidgets as widgets
from IPython.display import display, clear_output, FileLink
import pandas as pd
We import all the libraries necessary for building an interactive web scraping project in Colab. This includes requests for handling HTTP requests, BeautifulSoup from bs4 for parsing HTML, and csv for managing CSV file operations. It also brings in files from google.colab for file downloads, ipywidgets and IPython display tools for creating an interactive UI, and pandas for data manipulation and display.
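Note that the google.colab import only works inside Colab. If you later want to reuse the same code in a local Jupyter environment, one optional pattern is to guard that import; this is a minimal sketch, not part of the original tutorial, and IN_COLAB is a helper flag introduced here purely for illustration:

try:
    from google.colab import files  # available only inside Colab
    IN_COLAB = True
except ImportError:
    files = None  # files.download() won't be available locally
    IN_COLAB = False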
def scrape_weather():
    """
    Scrapes weather forecast data for San Francisco from the National Weather Service.
    Returns a list of dictionaries containing the period, short description, and temperature.
    """
    url = "https://forecast.weather.gov/MapClick.php?lat=37.7772&lon=-122.4168"
    print("Scraping weather data from:", url)
    response = requests.get(url)
    if response.status_code != 200:
        print("Error fetching page:", url)
        return None
    soup = BeautifulSoup(response.text, 'html.parser')
    # The seven-day forecast lives in a container with id="seven-day-forecast"
    seven_day = soup.find(id="seven-day-forecast")
    forecast_items = seven_day.find_all(class_="tombstone-container")
    weather_data = []  # a list, not a tuple, so entries can be appended
    for forecast in forecast_items:
        period = forecast.find(class_="period-name").get_text() if forecast.find(class_="period-name") else ''
        short_desc = forecast.find(class_="short-desc").get_text() if forecast.find(class_="short-desc") else ''
        temp = forecast.find(class_="temp").get_text() if forecast.find(class_="temp") else ''
        weather_data.append({
            "period": period,
            "short_desc": short_desc,
            "temp": temp
        })
    print(f"Scraped {len(weather_data)} forecast entries.")
    return weather_data
With the above function, we retrieve the weather forecast for San Francisco from the National Weather Service. It makes an HTTP request to the forecast page, parses the HTML with BeautifulSoup, and extracts details such as the period, short description, and temperature for each forecast entry. The collected data is stored as a list of dictionaries and returned.
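Before wiring the scraper into the UI, you may want to sanity-check it on its own. A minimal example (the actual values depend on the live forecast page at the time you run it):

data = scrape_weather()
if data:
    # Inspect the first forecast entry, e.g. {'period': 'Tonight', ...}
    print(data[0])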
def save_to_csv(data, filename="weather.csv"):
    """
    Saves the provided data (a list of dictionaries) to a CSV file.
    """
    with open(filename, "w", newline="", encoding='utf-8') as f:
        writer = csv.DictWriter(f, fieldnames=("period", "short_desc", "temp"))
        writer.writeheader()
        writer.writerows(data)
    print(f"Data saved to {filename}")
    return filename
Now, this function takes the scraped weather data, a list of dictionaries, and writes it to a CSV file using Python's csv module. It opens the file in write mode with UTF-8 encoding, initializes a csv.DictWriter with the predefined fieldnames ("period", "short_desc", and "temp"), writes the header row, and then writes all the rows of data.
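As a quick round-trip check, you can call save_to_csv on a small sample row and read the file back with pandas. The row below is purely illustrative, not real forecast data:

sample = [{"period": "Tonight", "short_desc": "Partly Cloudy", "temp": "Low: 52 °F"}]
path = save_to_csv(sample, filename="weather_test.csv")
print(pd.read_csv(path))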
out = widgets.Output()
def on_button_click(b):
    """
    Callback function that gets executed when the "Scrape Weather Data" button is clicked.
    It scrapes the weather data, saves it to CSV, displays the data in a table,
    and shows a download link for the CSV file.
    """
    with out:
        clear_output()
        print("Starting weather data scrape...")
        data = scrape_weather()
        if data is None:
            print("Failed to scrape weather data.")
            return
        csv_filename = save_to_csv(data)
        df = pd.DataFrame(data)
        print("\nWeather Forecast Data:")
        display(df)
        print("\nDownload CSV file:")
        display(FileLink(csv_filename))
button = widgets.Button(description="Scrape Weather Data", button_style="success")
button.on_click(on_button_click)
display(button, out)
Finally, the last snippet sets up an interactive UI in Colab using ipywidgets that, when triggered, scrapes the weather data, displays it in a table, and provides a CSV download link. It efficiently combines web scraping and user interaction in a compact notebook setup.
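As a side note, the files module imported from google.colab earlier is not actually used by the callback above, since FileLink handles the download link. If you would rather push the CSV straight to the browser's download prompt, one possible Colab-only variation is to end the callback with:

files.download(csv_filename)  # triggers an immediate browser download in Colab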
Sample Output
In this tutorial, we demonstrated how to combine web scraping with an interactive UI in a Google Colab environment. We built a complete project that fetches real-time weather data, processes it using BeautifulSoup, and displays the results in an interactive table while offering a CSV download option.
Here is the Colab notebook for the above project.

Asif Razzaq is the CEO of Marktechpost Media Inc. As a visionary entrepreneur and engineer, Asif is committed to harnessing the potential of Artificial Intelligence for social good. His most recent endeavor is the launch of an Artificial Intelligence media platform, Marktechpost, which stands out for its in-depth coverage of machine learning and deep learning news that is both technically sound and easily understandable by a broad audience. The platform boasts over 2 million monthly views, illustrating its popularity among readers.