A Python Tool for Fetching Air Pollution Data from Google Maps Air Quality APIs | by Robert Martin-Short | Oct 2023

In August 2023, Google announced the addition of an air quality service to its list of mapping APIs. You can read more about that here. It turns out this information is now also accessible from within the Google Maps app, though the data available via the APIs is much richer.

According to the announcement, Google is combining information from many sources at different resolutions (ground-based pollution sensors, satellite data, live traffic information and predictions from numerical models) to produce a dynamically updated dataset of air quality in 100 countries at up to 500m resolution. This sounds like a very interesting and potentially useful dataset for all sorts of mapping, healthcare and planning applications!

When I first read about this I was planning to try it out in a "talk to your data" application, using some of the lessons learned from building this travel mapper tool. Maybe a system that can plot a time series of air pollution concentrations in your favorite city, or perhaps a tool to help people plan hikes in their local area so as to avoid bad air?

There are three API tools that can help here: a "current conditions" service, which provides current air quality index values and pollutant concentrations at a given location; a "historical conditions" service, which does the same but at hourly intervals for up to 30 days in the past; and a "heatmap" service, which provides current conditions over a given area as an image.

Previously, I had used the excellent googlemaps package to call Google Maps APIs in Python, but these new APIs are not yet supported. Surprisingly, beyond the official documentation I could find few examples of people using these new tools and no pre-existing Python packages designed to call them. I'd be happily corrected though if someone knows otherwise!

I therefore built some quick tools of my own, and in this post we walk through how they work and how to use them. I hope this will be useful to anyone wanting to experiment with these new APIs in Python and looking for a place to start. All the code for this project can be found here, and I'll likely be expanding this repo over time as I add more functionality and build some sort of mapping application with the air quality data.

Let's get started! In this section we'll go over how to fetch air quality data at a given location with Google Maps. You'll first need an API key, which you can generate via your Google Cloud account. There is a 90-day free trial period, after which you'll pay for the API services you use. Make sure to enable the "Air Quality API", and be aware of the pricing policies before you start making a lot of calls!

Screenshot of the Google Cloud API library, from where you can activate the Air Quality API. Image generated by the author.

I usually store my API key in an .env file and load it with dotenv using a function like this:

import os

from dotenv import load_dotenv
from pathlib import Path

def load_secrets():
    load_dotenv()
    env_path = Path(".") / ".env"
    load_dotenv(dotenv_path=env_path)

    google_maps_key = os.getenv("GOOGLE_MAPS_API_KEY")

    return {
        "GOOGLE_MAPS_API_KEY": google_maps_key,
    }
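Assuming your .env file contains a line like GOOGLE_MAPS_API_KEY=your-key-here (hypothetical value, of course), usage is then just:

secrets = load_secrets()
GOOGLE_MAPS_API_KEY = secrets["GOOGLE_MAPS_API_KEY"]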

Getting current conditions requires a POST request, as detailed here. We're going to take inspiration from the googlemaps package to do this in a way that can be generalized. First, we build a client class that uses requests to make the call. The goal is quite straightforward: we want to build a URL like the one below, and include all the request options specific to the user's query.

https://airquality.googleapis.com/v1/currentConditions:lookup?key=YOUR_API_KEY

The Client class takes in our API key as key and then builds the request_url for the query. It accepts request options as a params dictionary and then puts them in the JSON request body, which is handled by the self.session.post() call.

import requests
import io

class Client(object):
    DEFAULT_BASE_URL = "https://airquality.googleapis.com"

    def __init__(self, key):
        self.session = requests.Session()
        self.key = key

    def request_post(self, url, params):
        request_url = self.compose_url(url)
        request_header = self.compose_header()
        request_body = params

        response = self.session.post(
            request_url,
            headers=request_header,
            json=request_body,
        )

        return self.get_body(response)

    def compose_url(self, path):
        return self.DEFAULT_BASE_URL + path + "?" + "key=" + self.key

    @staticmethod
    def get_body(response):
        body = response.json()

        if "error" in body:
            return body["error"]

        return body

    @staticmethod
    def compose_header():
        return {
            "Content-Type": "application/json",
        }

Now we can make a function that helps the user assemble valid request options for the current conditions API and then uses this Client class to make the request. Again, this is inspired by the design of the googlemaps package.

def current_conditions(
    client,
    location,
    include_local_AQI=True,
    include_health_suggestion=False,
    include_all_pollutants=True,
    include_additional_pollutant_info=False,
    include_dominant_pollutant_conc=True,
    language=None,
):
    """
    See documentation for this API here
    https://developers.google.com/maps/documentation/air-quality/reference/rest/v1/currentConditions/lookup
    """
    params = {}

    if isinstance(location, dict):
        params["location"] = location
    else:
        raise ValueError(
            "Location argument must be a dictionary containing latitude and longitude"
        )

    extra_computations = []
    if include_local_AQI:
        extra_computations.append("LOCAL_AQI")

    if include_health_suggestion:
        extra_computations.append("HEALTH_RECOMMENDATIONS")

    if include_additional_pollutant_info:
        extra_computations.append("POLLUTANT_ADDITIONAL_INFO")

    if include_all_pollutants:
        extra_computations.append("POLLUTANT_CONCENTRATION")

    if include_dominant_pollutant_conc:
        extra_computations.append("DOMINANT_POLLUTANT_CONCENTRATION")

    if language:
        params["language"] = language

    params["extraComputations"] = extra_computations

    return client.request_post("/v1/currentConditions:lookup", params)

The options for this API are relatively straightforward. It needs a dictionary with the longitude and latitude of the point you want to investigate, and can optionally take in various other arguments that control how much information is returned. Let's see it in action with all the arguments set to True.

# set up client
client = Client(key=GOOGLE_MAPS_API_KEY)
# a location in Los Angeles, CA
location = {"longitude": -118.3, "latitude": 34.1}
# a JSON response
current_conditions_data = current_conditions(
    client,
    location,
    include_health_suggestion=True,
    include_additional_pollutant_info=True,
)

A lot of interesting information is returned! Not only do we have the air quality index values from the Universal and US-based AQI indices, but we also have concentrations of the major pollutants, a description of each one and an overall set of health recommendations for the current air quality.

{'dateTime': '2023-10-12T05:00:00Z',
'regionCode': 'us',
'indexes': [{'code': 'uaqi',
'displayName': 'Universal AQI',
'aqi': 60,
'aqiDisplay': '60',
'color': {'red': 0.75686276, 'green': 0.90588236, 'blue': 0.09803922},
'category': 'Good air quality',
'dominantPollutant': 'pm10'},
{'code': 'usa_epa',
'displayName': 'AQI (US)',
'aqi': 39,
'aqiDisplay': '39',
'color': {'green': 0.89411765},
'category': 'Good air quality',
'dominantPollutant': 'pm10'}],
 'pollutants': [{'code': 'co',
'displayName': 'CO',
'fullName': 'Carbon monoxide',
'concentration': {'value': 292.61, 'units': 'PARTS_PER_BILLION'},
'additionalInfo': {'sources': 'Typically originates from incomplete combustion of carbon fuels, such as that which occurs in car engines and power plants.',
'effects': 'When inhaled, carbon monoxide can prevent the blood from carrying oxygen. Exposure may cause dizziness, nausea and headaches. Exposure to extreme concentrations can lead to loss of consciousness.'}},
{'code': 'no2',
'displayName': 'NO2',
'fullName': 'Nitrogen dioxide',
'concentration': {'value': 22.3, 'units': 'PARTS_PER_BILLION'},
'additionalInfo': {'sources': 'Main sources are fuel burning processes, such as those used in industry and transportation.',
'effects': 'Exposure may cause increased bronchial reactivity in patients with asthma, lung function decline in patients with Chronic Obstructive Pulmonary Disease (COPD), and increased risk of respiratory infections, especially in young children.'}},
{'code': 'o3',
'displayName': 'O3',
'fullName': 'Ozone',
'concentration': {'value': 24.17, 'units': 'PARTS_PER_BILLION'},
'additionalInfo': {'sources': 'Ozone is created in a chemical reaction between atmospheric oxygen, nitrogen oxides, carbon monoxide and organic compounds, in the presence of sunlight.',
'effects': 'Ozone can irritate the airways and cause coughing, a burning sensation, wheezing and shortness of breath. Additionally, ozone is one of the major components of photochemical smog.'}},
{'code': 'pm10',
'displayName': 'PM10',
'fullName': 'Inhalable particulate matter (<10µm)',
'concentration': {'value': 44.48, 'units': 'MICROGRAMS_PER_CUBIC_METER'},
'additionalInfo': {'sources': 'Main sources are combustion processes (e.g. indoor heating, wildfires), mechanical processes (e.g. construction, mineral dust, agriculture) and biological particles (e.g. pollen, bacteria, mold).',
'effects': 'Inhalable particles can penetrate into the lungs. Short term exposure can cause irritation of the airways, coughing, and aggravation of heart and lung diseases, expressed as difficulty breathing, heart attacks and even premature death.'}},
{'code': 'pm25',
'displayName': 'PM2.5',
'fullName': 'Fine particulate matter (<2.5µm)',
'concentration': {'value': 11.38, 'units': 'MICROGRAMS_PER_CUBIC_METER'},
'additionalInfo': {'sources': 'Main sources are combustion processes (e.g. power plants, indoor heating, car exhausts, wildfires), mechanical processes (e.g. construction, mineral dust) and biological particles (e.g. bacteria, viruses).',
'effects': 'Fine particles can penetrate into the lungs and bloodstream. Short term exposure can cause irritation of the airways, coughing and aggravation of heart and lung diseases, expressed as difficulty breathing, heart attacks and even premature death.'}},
{'code': 'so2',
'displayName': 'SO2',
'fullName': 'Sulfur dioxide',
'concentration': {'value': 0, 'units': 'PARTS_PER_BILLION'},
'additionalInfo': {'sources': 'Main sources are burning processes of sulfur-containing fuel in industry, transportation and power plants.',
'effects': 'Exposure causes irritation of the respiratory tract, coughing and generates local inflammatory reactions. These in turn, may cause aggravation of lung diseases, even with short term exposure.'}}],
 'healthRecommendations': {'generalPopulation': 'With this level of air quality, you have no limitations. Enjoy the outdoors!',
  'elderly': 'If you start to feel respiratory discomfort such as coughing or breathing difficulties, consider reducing the intensity of your outdoor activities. Try to limit the time you spend near busy roads, construction sites, open fires and other sources of smoke.',
  'lungDiseasePopulation': 'If you start to feel respiratory discomfort such as coughing or breathing difficulties, consider reducing the intensity of your outdoor activities. Try to limit the time you spend near busy roads, industrial emission stacks, open fires and other sources of smoke.',
  'heartDiseasePopulation': 'If you start to feel respiratory discomfort such as coughing or breathing difficulties, consider reducing the intensity of your outdoor activities. Try to limit the time you spend near busy roads, construction sites, industrial emission stacks, open fires and other sources of smoke.',
  'athletes': 'If you start to feel respiratory discomfort such as coughing or breathing difficulties, consider reducing the intensity of your outdoor activities. Try to limit the time you spend near busy roads, construction sites, industrial emission stacks, open fires and other sources of smoke.',
  'pregnantWomen': 'To keep you and your baby healthy, consider reducing the intensity of your outdoor activities. Try to limit the time you spend near busy roads, construction sites, open fires and other sources of smoke.',
  'children': 'If you start to feel respiratory discomfort such as coughing or breathing difficulties, consider reducing the intensity of your outdoor activities. Try to limit the time you spend near busy roads, construction sites, open fires and other sources of smoke.'}}
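As a quick illustration of working with this output (a minimal sketch, assuming the response structure shown above), we can flatten the headline numbers into plain dictionaries:

# minimal sketch: flatten the headline values from the response above
aqi_values = {
    index["code"]: index["aqi"]
    for index in current_conditions_data["indexes"]
}
pollutant_values = {
    p["code"]: p["concentration"]["value"]
    for p in current_conditions_data["pollutants"]
}
print(aqi_values)        # {'uaqi': 60, 'usa_epa': 39}
print(pollutant_values)  # {'co': 292.61, 'no2': 22.3, ...}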

Wouldn't it be nice to be able to fetch a timeseries of these AQI and pollutant values for a given location? That might reveal interesting patterns such as correlations between the pollutants or daily fluctuations caused by traffic or weather-related factors.

We can do this with another POST request to the historical conditions API, which will give us an hourly history. This works in much the same way as current conditions, the only major difference being that since the results can be quite long they are returned as multiple pages, which requires a bit of extra logic to handle.

Let's modify the request_post method of Client to handle this.

    def request_post(self, url, params):
        request_url = self.compose_url(url)
        request_header = self.compose_header()
        request_body = params

        response = self.session.post(
            request_url,
            headers=request_header,
            json=request_body,
        )

        response_body = self.get_body(response)

        # put the first page in the response dictionary
        page = 1
        final_response = {
            "page_{}".format(page): response_body
        }
        # fetch all the pages if needed
        while "nextPageToken" in response_body:
            # call again with the next page's token
            request_body.update({
                "pageToken": response_body["nextPageToken"]
            })
            response = self.session.post(
                request_url,
                headers=request_header,
                json=request_body,
            )
            response_body = self.get_body(response)
            page += 1
            final_response["page_{}".format(page)] = response_body

        return final_response

This handles the case where response_body contains a field called nextPageToken, which is the id of the next page of data that has been generated and is ready to fetch. Where that information exists, we just need to call the API again with a new param called pageToken, which directs it to the relevant page. We do this repeatedly in a while loop until there are no more pages left. Our final_response dictionary therefore now contains another layer denoted by page number. For calls to current_conditions there will only ever be one page, but for calls to historical_conditions there may be several.
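For illustration, the paginated dictionary returned by request_post then has roughly this shape (contents abbreviated; an assumed sketch based on the hoursInfo field consumed by the parsing function later on):

# assumed shape of the paginated response, for illustration only
final_response = {
    "page_1": {"hoursInfo": ["..."], "nextPageToken": "abc123"},
    "page_2": {"hoursInfo": ["..."]},  # the last page has no nextPageToken
}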

With that taken care of, we can write a historical_conditions function in a very similar style to current_conditions.

def historical_conditions(
    client,
    location,
    specific_time=None,
    lag_time=None,
    specific_period=None,
    include_local_AQI=True,
    include_health_suggestion=False,
    include_all_pollutants=True,
    include_additional_pollutant_info=False,
    include_dominant_pollutant_conc=True,
    language=None,
):
    """
    See documentation for this API here
    https://developers.google.com/maps/documentation/air-quality/reference/rest/v1/history/lookup
    """
    params = {}

    if isinstance(location, dict):
        params["location"] = location
    else:
        raise ValueError(
            "Location argument must be a dictionary containing latitude and longitude"
        )

    if isinstance(specific_period, dict) and not specific_time and not lag_time:
        assert "startTime" in specific_period
        assert "endTime" in specific_period

        params["period"] = specific_period

    elif specific_time and not lag_time and not isinstance(specific_period, dict):
        # note that time must be in the "Zulu" format
        # e.g. datetime.datetime.strftime(datetime.datetime.now(), "%Y-%m-%dT%H:%M:%SZ")
        params["dateTime"] = specific_time

    # lag periods in hours
    elif lag_time and not specific_time and not isinstance(specific_period, dict):
        params["hours"] = lag_time

    else:
        raise ValueError(
            "Must provide specific_time, specific_period or lag_time arguments"
        )

    extra_computations = []
    if include_local_AQI:
        extra_computations.append("LOCAL_AQI")

    if include_health_suggestion:
        extra_computations.append("HEALTH_RECOMMENDATIONS")

    if include_additional_pollutant_info:
        extra_computations.append("POLLUTANT_ADDITIONAL_INFO")

    if include_all_pollutants:
        extra_computations.append("POLLUTANT_CONCENTRATION")

    if include_dominant_pollutant_conc:
        extra_computations.append("DOMINANT_POLLUTANT_CONCENTRATION")

    if language:
        params["language"] = language

    params["extraComputations"] = extra_computations
    # page size default set to 100 here
    params["pageSize"] = 100
    # page token gets filled in if needed by the request_post method
    params["pageToken"] = ""

    return client.request_post("/v1/history:lookup", params)

To define the historical period, the API can accept a lag_time in hours, up to 720 (30 days). It can also accept a specific_period dictionary, which defines start and end times in the format described in the comments above. Finally, to fetch a single hour of data, it can accept just one timestamp, provided by specific_time. Also note the use of the pageSize parameter, which controls how many time points are returned in each call to the API. The default here is 100. A sketch of the two explicit time formats is shown below.
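Here is how the two explicit time options might be constructed (the start/end timestamps are hypothetical values, purely for illustration):

import datetime

# a single one-hour snapshot, in the "Zulu" format the API expects
specific_time = datetime.datetime.now(datetime.timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

# an explicit start/end window
specific_period = {
    "startTime": "2023-10-01T00:00:00Z",
    "endTime": "2023-10-05T00:00:00Z",
}

# then e.g. historical_conditions(client, location, specific_period=specific_period)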

Let's try it out.

# set up client
client = Client(key=GOOGLE_MAPS_API_KEY)
# a location in Los Angeles, CA
location = {"longitude": -118.3, "latitude": 34.1}
# a JSON response
history_conditions_data = historical_conditions(
    client,
    location,
    lag_time=720,
)

We should get a long, nested JSON response that contains the AQI index values and specific pollutant values at one-hour increments over the last 720 hours. There are many ways to format this into a structure that's more amenable to visualization and analysis, and the function below will convert it into a pandas dataframe in "long" format, which works well with seaborn for plotting.

from itertools import chain
import pandas as pd

def historical_conditions_to_df(response_dict):

    chained_pages = list(chain(*[response_dict[p]["hoursInfo"] for p in [*response_dict]]))

    all_indexes = []
    all_pollutants = []
    for i in range(len(chained_pages)):
        # need this check in case one of the timestamps is missing data, which can sometimes happen
        if "indexes" in chained_pages[i]:
            this_element = chained_pages[i]
            # fetch the time
            time = this_element["dateTime"]
            # fetch all the index values and add metadata
            all_indexes += [(time, x["code"], x["displayName"], "index", x["aqi"], None) for x in this_element["indexes"]]
            # fetch all the pollutant values and add metadata
            all_pollutants += [(time, x["code"], x["fullName"], "pollutant", x["concentration"]["value"], x["concentration"]["units"]) for x in this_element["pollutants"]]

    all_results = all_indexes + all_pollutants
    # generate "long format" dataframe
    res = pd.DataFrame(all_results, columns=["time", "code", "name", "type", "value", "unit"])
    res["time"] = pd.to_datetime(res["time"])
    return res

Running this on the output of historical_conditions will produce a dataframe whose columns are formatted for easy analysis.

df = historical_conditions_to_df(history_conditions_data)

Example dataframe of historical AQI data, ready for plotting.

And we can now plot the result in seaborn or any other visualization tool.

import seaborn as sns

g = sns.relplot(
    x="time",
    y="value",
    data=df[df["code"].isin(["uaqi", "usa_epa", "pm25", "pm10"])],
    kind="line",
    col="name",
    col_wrap=4,
    hue="type",
    height=4,
    facet_kws={"sharey": False, "sharex": False},
)
g.set_xticklabels(rotation=90)

Universal AQI, US AQI, pm25 and pm10 values for this location in LA over a 30 day period. Image generated by the author.

This is already very interesting! There are clearly several periodicities in the pollutant time series, and it's notable that the US AQI is closely correlated with the pm25 and pm10 concentrations, as expected. I'm much less familiar with the Universal AQI that Google is providing here, so I can't explain why it appears anti-correlated with pm25 and pm10. Does a smaller UAQI mean better air quality? Despite some searching around I've been unable to find a good answer.

Now for the final use case of the Google Maps Air Quality API: generating heatmap tiles. The documentation about this is sparse, which is a shame because these tiles are a powerful tool for visualizing current air quality, especially when combined with a Folium map.

We fetch them with a GET request, which involves building a URL in the following format, where the location of the tile is specified by zoom, x and y.

GET https://airquality.googleapis.com/v1/mapTypes/{mapType}/heatmapTiles/{zoom}/{x}/{y}

What do zoom, x and y mean? We can answer this by learning about how Google Maps converts coordinates in latitude and longitude into "tile coordinates", which is described in detail here. Essentially, Google Maps stores imagery in grids where each cell measures 256 x 256 pixels and the real-world dimensions of the cell are a function of the zoom level. When we make a call to the API, we need to specify which grid to draw from (determined by the zoom level) and where on the grid to draw from (determined by the x and y tile coordinates). What comes back is a bytes array that can be read by the Python Imaging Library (PIL) or a similar image processing package.
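To make that concrete, here is a minimal standalone sketch of the conversion (the same Web Mercator math that the TileHelper class below implements):

import math

def latlon_to_tile(lat, lon, zoom, tile_size=256):
    # project lat/lon onto the 256 x 256 "world coordinate" square
    siny = min(max(math.sin(lat * math.pi / 180.0), -0.9999), 0.9999)
    world_x = tile_size * (0.5 + lon / 360.0)
    world_y = tile_size * (0.5 - math.log((1 + siny) / (1 - siny)) / (4 * math.pi))
    # scale by 2**zoom, then divide by the tile size to get tile coordinates
    scale = 1 << zoom
    return (math.floor(world_x * scale / tile_size),
            math.floor(world_y * scale / tile_size))

# the Los Angeles location used earlier, at zoom level 7
print(latlon_to_tile(34.1, -118.3, 7))  # (21, 51)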

Having formed our URL in the above format, we can add a few methods to the Client class that will allow us to fetch the corresponding image.

    def request_get(self, url):
        request_url = self.compose_url(url)
        response = self.session.get(request_url)

        # for images coming from the heatmap tiles service
        return self.get_image(response)

    @staticmethod
    def get_image(response):
        if response.status_code == 200:
            image_content = response.content
            # note use of Image from PIL here
            # needs: from PIL import Image
            image = Image.open(io.BytesIO(image_content))
            return image
        else:
            print("GET request for image returned an error")
            return None

This is good, but what we really need is the ability to convert a set of coordinates in longitude and latitude into tile coordinates. The documentation explains how: we first convert the coordinates into the Mercator projection, from which we convert to "pixel coordinates" using the specified zoom level. Finally we translate those into the tile coordinates. To handle all these transformations, we can use the TileHelper class below.

import math
import numpy as np

class TileHelper(object):

    def __init__(self, tile_size=256):
        self.tile_size = tile_size

    def location_to_tile_xy(self, location, zoom_level=4):
        # Based on the function here
        # https://developers.google.com/maps/documentation/javascript/examples/map-coordinates#maps_map_coordinates-javascript

        lat = location["latitude"]
        lon = location["longitude"]

        world_coordinate = self._project(lat, lon)
        scale = 1 << zoom_level

        pixel_coord = (math.floor(world_coordinate[0] * scale), math.floor(world_coordinate[1] * scale))
        tile_coord = (
            math.floor(world_coordinate[0] * scale / self.tile_size),
            math.floor(world_coordinate[1] * scale / self.tile_size),
        )

        return world_coordinate, pixel_coord, tile_coord

    def tile_to_bounding_box(self, tx, ty, zoom_level):
        # see https://developers.google.com/maps/documentation/javascript/coordinates
        # for details
        box_north = self._tiletolat(ty, zoom_level)
        # tile numbers advance towards the south
        box_south = self._tiletolat(ty + 1, zoom_level)
        box_west = self._tiletolon(tx, zoom_level)
        # tile numbers advance towards the east
        box_east = self._tiletolon(tx + 1, zoom_level)

        # (latmin, latmax, lonmin, lonmax)
        return (box_south, box_north, box_west, box_east)

    @staticmethod
    def _tiletolon(x, zoom):
        return x / math.pow(2.0, zoom) * 360.0 - 180.0

    @staticmethod
    def _tiletolat(y, zoom):
        n = math.pi - (2.0 * math.pi * y) / math.pow(2.0, zoom)
        return math.atan(math.sinh(n)) * (180.0 / math.pi)

    def _project(self, lat, lon):
        siny = math.sin(lat * math.pi / 180.0)
        siny = min(max(siny, -0.9999), 0.9999)

        return (
            self.tile_size * (0.5 + lon / 360),
            self.tile_size * (0.5 - math.log((1 + siny) / (1 - siny)) / (4 * math.pi)),
        )

    @staticmethod
    def find_nearest_corner(location, bounds):
        corner_lat_idx = np.argmin([
            np.abs(bounds[0] - location["latitude"]),
            np.abs(bounds[1] - location["latitude"]),
        ])

        corner_lon_idx = np.argmin([
            np.abs(bounds[2] - location["longitude"]),
            np.abs(bounds[3] - location["longitude"]),
        ])

        if (corner_lat_idx == 0) and (corner_lon_idx == 0):
            # closest is latmin, lonmin
            direction = "southwest"
        elif (corner_lat_idx == 0) and (corner_lon_idx == 1):
            direction = "southeast"
        elif (corner_lat_idx == 1) and (corner_lon_idx == 0):
            direction = "northwest"
        else:
            direction = "northeast"

        corner_coords = (bounds[corner_lat_idx], bounds[corner_lon_idx + 2])
        return corner_coords, direction

    @staticmethod
    def get_adjoining_tiles(tx, ty, direction):
        # remember that tile y coordinates increase towards the south
        if direction == "southwest":
            return [(tx - 1, ty), (tx - 1, ty + 1), (tx, ty + 1)]
        elif direction == "southeast":
            return [(tx + 1, ty), (tx + 1, ty + 1), (tx, ty + 1)]
        elif direction == "northwest":
            return [(tx - 1, ty - 1), (tx - 1, ty), (tx, ty - 1)]
        else:
            return [(tx + 1, ty - 1), (tx + 1, ty), (tx, ty - 1)]

We can see that location_to_tile_xy takes in a location dictionary and zoom level and returns the tile in which that point can be found. Another helpful function is tile_to_bounding_box, which finds the bounding coordinates of a specified grid cell. We need this if we're going to geolocate the cell and plot it on a map. A quick usage sketch follows below.
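Reusing the Los Angeles location from earlier:

helper = TileHelper()
location = {"longitude": -118.3, "latitude": 34.1}

world_coord, pixel_coord, tile_coord = helper.location_to_tile_xy(location, zoom_level=7)
print(tile_coord)  # (21, 51)

bounds = helper.tile_to_bounding_box(tx=tile_coord[0], ty=tile_coord[1], zoom_level=7)
print(bounds)  # (latmin, latmax, lonmin, lonmax)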

Let's see how this works inside the air_quality_tile function below, which is going to take in our client, location and a string indicating what type of tile we want to fetch. We also need to specify a zoom level, which can be difficult to choose at first and requires some trial and error. We'll discuss the get_adjoining_tiles argument shortly.

def air_quality_tile(
    client,
    location,
    pollutant="UAQI_INDIGO_PERSIAN",
    zoom=4,
    get_adjoining_tiles=True,
):

    # see https://developers.google.com/maps/documentation/air-quality/reference/rest/v1/mapTypes.heatmapTiles/lookupHeatmapTile

    assert pollutant in [
        "UAQI_INDIGO_PERSIAN",
        "UAQI_RED_GREEN",
        "PM25_INDIGO_PERSIAN",
        "GBR_DEFRA",
        "DEU_UBA",
        "CAN_EC",
        "FRA_ATMO",
        "US_AQI",
    ]

    # contains helpful methods for handling the tile coordinates
    helper = TileHelper()

    # get the tile that the location is in
    world_coordinate, pixel_coord, tile_coord = helper.location_to_tile_xy(location, zoom_level=zoom)

    # get the bounding box of the tile
    bounding_box = helper.tile_to_bounding_box(tx=tile_coord[0], ty=tile_coord[1], zoom_level=zoom)

    if get_adjoining_tiles:
        nearest_corner, nearest_corner_direction = helper.find_nearest_corner(location, bounding_box)
        adjoining_tiles = helper.get_adjoining_tiles(tile_coord[0], tile_coord[1], nearest_corner_direction)
    else:
        adjoining_tiles = []

    tiles = []
    # get all the adjoining tiles, plus the one in question
    for tile in adjoining_tiles + [tile_coord]:

        bounding_box = helper.tile_to_bounding_box(tx=tile[0], ty=tile[1], zoom_level=zoom)
        image_response = client.request_get(
            "/v1/mapTypes/" + pollutant + "/heatmapTiles/" + str(zoom) + "/" + str(tile[0]) + "/" + str(tile[1])
        )

        # convert the PIL image to numpy
        try:
            image_response = np.array(image_response)
        except Exception:
            image_response = None

        tiles.append({
            "bounds": bounding_box,
            "image": image_response,
        })

    return tiles

From reading the code, we can see that the workflow is as follows: first, find the tile coordinates of the location of interest. This specifies the grid cell we want to fetch. Then, find the bounding coordinates of this grid cell. If we want to fetch the surrounding tiles, find the nearest corner of the bounding box and then use that to calculate the tile coordinates of the three adjacent grid cells. Then call the API and return each of the tiles as an image with its corresponding bounding box.

We can run this in the standard way, as follows:

client = Client(key=GOOGLE_MAPS_API_KEY)
location = {"longitude": -118.3, "latitude": 34.1}
zoom = 7
tiles = air_quality_tile(
    client,
    location,
    pollutant="UAQI_INDIGO_PERSIAN",
    zoom=zoom,
    get_adjoining_tiles=False,
)

And then plot with folium for a zoomable map! Note that I'm using leafmap here, because this package can generate Folium maps that are compatible with gradio, a powerful tool for building simple user interfaces for Python applications. Check out this article for an example.

import leafmap.foliumap as leafmap
import folium

lat = location["latitude"]
lon = location["longitude"]

map = leafmap.Map(location=[lat, lon], tiles="OpenStreetMap", zoom_start=zoom)

for tile in tiles:
    latmin, latmax, lonmin, lonmax = tile["bounds"]
    AQ_image = tile["image"]
    folium.raster_layers.ImageOverlay(
        image=AQ_image,
        bounds=[[latmin, lonmin], [latmax, lonmax]],
        opacity=0.7,
    ).add_to(map)
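If you're working outside a notebook, the map can be written to HTML and opened in a browser; leafmap's folium backend inherits from folium.Map, so the standard save method should work:

# write the interactive map to a standalone HTML file
map.save("air_quality_map.html")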

Perhaps disappointingly, the tile containing our location at this zoom level is mostly sea, although it's still nice to see the air pollution plotted on top of a detailed map. If you zoom in, you can see that road traffic information is being used to inform the air quality signals in urban areas.

Plotting an air quality heatmap tile on top of a Folium map. Image generated by the author.

Setting get_adjoining_tiles=True gives us a much nicer map because it fetches the three closest, non-overlapping tiles at that zoom level. In our case that helps a lot to make the map more presentable.

When we also fetch the adjoining tiles, a much more interesting result is produced. Note that the colors here show the Universal AQI index. Image generated by the author.

I personally prefer the images generated when pollutant=US_AQI, but there are several different options. Unfortunately the API doesn't return a color scale, although it would be possible to generate one using the pixel values in the image and knowledge of what the colors mean. A starting point for that is sketched after the figure below.

The same tiles as above colored according to the US AQI. This map was generated on 10/12/2023, and the bright red spot in central CA appears to be a prescribed fire in the hills near Coalinga, according to this tool https://www.frontlinewildfire.com/california-wildfire-map/. Image generated by the author.
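As a minimal sketch of the color-scale idea (assuming the tiles come back as RGBA arrays, which is how the PNG responses appear after the np.array conversion above), we can extract the distinct opaque colors from a tile ordered by frequency. Mapping those colors to AQI values would still require external knowledge of the palette:

import numpy as np

def extract_palette(tile_image, max_colors=20):
    # tile_image is the numpy array stored under tile["image"]
    pixels = tile_image.reshape(-1, tile_image.shape[-1])
    # drop fully transparent pixels
    opaque = pixels[pixels[:, -1] > 0]
    # unique colors, most common first
    colors, counts = np.unique(opaque, axis=0, return_counts=True)
    return colors[np.argsort(counts)[::-1]][:max_colors]

palette = extract_palette(tiles[0]["image"])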

Thanks for making it to the end! Here we explored how to use the Google Maps Air Quality APIs to deliver results in Python, which could be used in all manner of interesting applications. In future I hope to follow up with another article about the air_quality_mapper tool as it evolves further, but I hope that the scripts discussed here will be useful in their own right. As always, any suggestions for further development would be much appreciated!


