
Hectares BC

Hectares BC was a collaborative system for analyzing the environment and natural resources of British Columbia.

It was a really cool pilot project that tested a new tool for analyzing geospatial data in the natural resource sector. You know how the provincial government has a ton of this data, but not enough people or tools to analyze it all?

Well, Hectares BC solved that problem by allowing scientists, researchers, government workers, and anyone else to easily access and work with geographic information – no need for specialized GIS skills!

This made planning, assessment, reporting, and decision making a breeze.

When it comes to planning, assessing, reporting, and making decisions about BC’s environmental and natural resources, scientists, researchers, environmental groups, government agencies, and others typically rely on a standard GIS workflow. This involves collecting and processing multiple layers of data, overlaying them, summarizing variables of interest for specific areas, and finally writing a report based on the summary numbers.
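
To make the overlay-and-summarize steps concrete, here’s a minimal sketch in Python using the geopandas library. The file names and column names (summary_areas.shp, forest_cover.shp, area_id, cover_class) are placeholders for illustration, not actual Hectares BC datasets.

    import geopandas as gpd

    # Hypothetical inputs: a polygon layer of summary areas (e.g. watersheds)
    # and a polygon layer carrying a variable of interest (e.g. forest cover class).
    areas = gpd.read_file("summary_areas.shp")     # columns: area_id, geometry
    forest = gpd.read_file("forest_cover.shp")     # columns: cover_class, geometry

    # Put both layers in the same (projected, metre-based) coordinate system.
    forest = forest.to_crs(areas.crs)

    # Overlay: intersect the forest polygons with the summary areas.
    pieces = gpd.overlay(forest, areas, how="intersection")

    # Summarize: total intersected area in hectares per summary area and cover class.
    pieces["hectares"] = pieces.geometry.area / 10_000
    summary = pieces.groupby(["area_id", "cover_class"])["hectares"].sum().reset_index()
    print(summary.head())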

Unfortunately, the first two steps of collecting and processing data often end up being redundant, as different groups end up doing the same work repeatedly. And to make matters worse, whenever the summary areas are changed, the entire process often needs to be repeated from scratch.

PostgreSQL was used as the database engine. It held 100 million records of hectare-by-hectare land information.
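
The actual Hectares BC schema isn’t described here, so the following is just a hypothetical sketch of what a hectare-by-hectare summary query could look like from Python with psycopg2; the connection string, table, and column names (hectare_grid, region_code, land_cover) are invented for illustration.

    import psycopg2

    # Hypothetical connection and table layout: one row per hectare cell, with a
    # region code and a land-cover attribute. Not the actual Hectares BC schema.
    conn = psycopg2.connect("dbname=hectares user=analyst")

    query = """
        SELECT region_code, land_cover, COUNT(*) AS hectares
        FROM hectare_grid              -- one record per hectare
        GROUP BY region_code, land_cover
        ORDER BY region_code, hectares DESC;
    """

    with conn, conn.cursor() as cur:
        cur.execute(query)
        for region_code, land_cover, hectares in cur.fetchall():
            print(region_code, land_cover, hectares)

    conn.close()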

 

Geospatial data analysis

Geospatial data analysis is a process of understanding and interpreting information related to the Earth’s surface by analyzing data that is tied to specific geographic locations. In simpler terms, it’s about studying data that has a spatial component, like addresses, latitude and longitude coordinates, or any other form of location information.

Here’s a friendly breakdown of some key concepts and steps:

  1. Geospatial data: This is the information that has a geographic component, such as location, distance, or area. Examples include the positions of cities on a map, the path of a river, or the boundaries of a country.
  2. Data sources: Geospatial data can be collected from various sources like satellite images, GPS devices, or even social media check-ins. Some common data formats used in geospatial analysis are Shapefiles, GeoJSON, and KML files (a short example of loading and mapping one of these formats follows this list).
  3. Tools and software: To work with geospatial data, you will need specialized tools and software like QGIS, ArcGIS, or Google Earth. These tools allow you to visualize, analyze, and manipulate the data to extract meaningful insights.
  4. Visualization: This is the process of displaying geospatial data on a map or in a chart to make it easier to understand. Common visualizations include heat maps, choropleth maps, and 3D maps.
  5. Analysis: Geospatial data analysis involves various techniques to study patterns, relationships, and trends in the data. Some examples of analysis techniques, two of which are sketched in code after this list, are:
  • Buffer analysis: Helps you identify areas within a certain distance of a specific location.
  • Spatial autocorrelation: Measures the degree to which similar values are clustered together in space.
  • Overlay analysis: Combines multiple layers of data to study the relationships between them.
  • Network analysis: Analyzes the structure and connectivity of different points or features on a map, such as roads or public transit routes.
  6. Applications: Geospatial data analysis is widely used in many fields, such as urban planning, environmental studies, public health, disaster management, and transportation. By understanding spatial patterns and relationships, decision-makers can make more informed choices and develop better strategies.
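
As promised in the data sources item above, here is a minimal sketch of loading one of those formats and drawing a simple choropleth map with geopandas and matplotlib. The file name (regions.geojson) and the column being mapped (population) are placeholders.

    import geopandas as gpd
    import matplotlib.pyplot as plt

    # Hypothetical input: a GeoJSON file of regions with a numeric attribute column.
    regions = gpd.read_file("regions.geojson")   # Shapefiles are read the same way

    # Choropleth: shade each region by the value of one column (placeholder name).
    regions.plot(column="population", cmap="viridis", legend=True)
    plt.title("Choropleth of a sample attribute")
    plt.show()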
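
And here is the buffer-and-overlay sketch mentioned in the analysis item. Again, the file names, column names, and the 500 m buffer distance are placeholders; EPSG:3005 (BC Albers) is just one example of a projected, metre-based coordinate system.

    import geopandas as gpd

    # Hypothetical inputs: point locations (e.g. wells) and a polygon land-use layer.
    wells = gpd.read_file("wells.geojson").to_crs(epsg=3005)       # metres
    land_use = gpd.read_file("land_use.geojson").to_crs(epsg=3005)

    # Buffer analysis: the area within 500 m of each well.
    buffers = wells.copy()
    buffers["geometry"] = wells.geometry.buffer(500)

    # Overlay analysis: which land-use polygons intersect those buffer zones?
    affected = gpd.overlay(land_use, buffers, how="intersection")
    print(affected[["well_id", "use_type"]].drop_duplicates())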

 

I found myself deeply engrossed in the intricate world of geospatial data. It was a field that fascinated me, with its blend of technology and geography, and how it allowed us to capture, analyze, and visualize spatial information in ways that were both innovative and insightful.

I remember the first day I walked into the Geospatial Analysis lab. The room buzzed with the soft hum of high-powered computers, each boasting large dual monitors that displayed vibrant maps and complex datasets. I settled into my workstation, a dedicated space with a 27-inch screen that seemed like a portal to another dimension where the digital representation of the earth was waiting to be explored.

The core of my studies revolved around Geographic Information Systems (GIS), a technology that uses a variety of data sources to create detailed maps and models. I learned to work with various GIS software, but my mainstay became ArcGIS Pro, with its comprehensive suite of tools that allowed for intricate spatial analysis and cartographic design.

One of my projects involved analyzing satellite imagery with resolutions as sharp as 0.5 meters per pixel, which meant I could clearly identify features like roadways, vehicles, and even large trees. I spent hours meticulously classifying land use patterns, tracing the outlines of urban development, and measuring the impact of human activities on natural landscapes.

Another significant aspect of my studies was learning about remote sensing, which included understanding the electromagnetic spectrum and how different sensors, aboard satellites or drones, captured data beyond what the human eye could see. I delved into spectral bands, learning how each range—from visible light to thermal infrared—could reveal different characteristics of the earth’s surface.
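
One of the simplest ways to combine spectral bands is NDVI, the normalized difference vegetation index, which contrasts red and near-infrared reflectance to highlight vegetation. A minimal sketch with rasterio, assuming a hypothetical four-band image with red as band 3 and near-infrared as band 4, looks roughly like this:

    import numpy as np
    import rasterio

    # Assumed input: a four-band image (blue, green, red, near-infrared) as GeoTIFF.
    with rasterio.open("multispectral.tif") as src:
        red = src.read(3).astype("float64")
        nir = src.read(4).astype("float64")

    # NDVI = (NIR - red) / (NIR + red); dense, healthy vegetation pushes it toward 1.
    ndvi = np.zeros_like(nir)
    np.divide(nir - red, nir + red, out=ndvi, where=(nir + red) != 0)
    print("NDVI range:", ndvi.min(), "to", ndvi.max())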

As part of my thesis, I focused on LiDAR data, which uses laser pulses to measure distances to the earth’s surface and create high-resolution 3D models of the terrain. The datasets were enormous, often multiple gigabytes in size, and processing them required both patience and a powerful computer with a robust CPU and plenty of RAM to handle the computations.
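
LiDAR point clouds are typically distributed as LAS or LAZ files. A rough sketch of reading one with laspy (version 2) and gridding the points into a coarse elevation surface might look like the following; the file name and the 10 m cell size are assumptions, and a real terrain-modelling workflow involves much more (ground filtering, interpolation, tiling):

    import numpy as np
    import laspy

    # Assumed input: a LAS tile of LiDAR returns (laspy 2.x API).
    las = laspy.read("tile.las")
    x, y, z = np.asarray(las.x), np.asarray(las.y), np.asarray(las.z)

    # Grid the points into 10 m cells, keeping the mean elevation per cell.
    cell = 10.0
    cols = ((x - x.min()) // cell).astype(int)
    rows = ((y - y.min()) // cell).astype(int)
    elev_sum = np.zeros((rows.max() + 1, cols.max() + 1))
    hit_count = np.zeros_like(elev_sum)
    np.add.at(elev_sum, (rows, cols), z)
    np.add.at(hit_count, (rows, cols), 1)
    dem = np.where(hit_count > 0, elev_sum / np.maximum(hit_count, 1), np.nan)
    print("Elevation grid shape:", dem.shape)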

Throughout my studies, I became proficient in data collection methods, such as GPS surveying, where I would venture out into the field with a handheld GPS unit capable of pinpointing locations with sub-meter accuracy. Back in the lab, I would overlay this ground-truthed data onto aerial imagery, ensuring the utmost precision in my analyses.
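
Ground-truth points like these usually come off the GPS unit as a simple table of coordinates. A short sketch of turning such a table into a spatial layer with geopandas, ready to overlay on imagery in a GIS, could look like this; the file names, column names, and target projection are placeholders:

    import pandas as pd
    import geopandas as gpd

    # Assumed input: a CSV exported from the GPS unit with site_id, lat, lon columns.
    points = pd.read_csv("field_points.csv")
    gdf = gpd.GeoDataFrame(
        points,
        geometry=gpd.points_from_xy(points["lon"], points["lat"]),
        crs="EPSG:4326",                      # WGS 84, as recorded by the receiver
    )

    # Reproject to match the aerial imagery, then save for overlay in a GIS.
    gdf = gdf.to_crs(epsg=26910)              # placeholder: UTM zone 10N, for example
    gdf.to_file("field_points.gpkg", driver="GPKG")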