#Featured #People

Geospatial jobs of the week: MSF UK, BP, COWI are hiring

If your company is looking for new talent and you want to share the opportunity with our community, feel free to submit a job using the online form for us to review and include in our list! If you would like to know more about our Geospatial Job Portal, read about it here.

If you are enthusiastic about location data or anything geospatial, then this is the job portal for you!

Looking for more positions in GIS, academia, product, or data science roles? Go directly to our searchable Geospatial Job Portal!

Featured Jobs

University of Maryland: GIS Data/Support Specialist
📍 Maryland, United States

The GIS Data/Support Specialist will provide essential mapping and GIS application support (data and map service development and maintenance) for the maintenance of FM’s campus mapping, with emphasis on the public campus web map and other widely used campus Enterprise GIS services. The GIS Data Specialist reports directly to the Enterprise GIS Manager, will be responsible for coordinating enterprise GIS project tasks and resources, and will assist with campus database and systems administration tasks. The position will also handle on-demand requests for data, maps, and modifications to existing data services.

UCAR: High Altitude Observatory (HAO) Lab Director
📍 Boulder, Colorado, US

MSF UK: Health Information and eHealth Strategic Lead
📍 London, UK

COWI: Solution Architect for GIS & IT
📍 Lyngby, Denmark

BP: SIM Senior Specialist Geospatial
📍 Sunbury, UK

Cognizant: GIS Developer
📍 Tampa, FL, US

Even if these jobs aren’t for you, they may help someone in your network. Please share!

And if there are any specific things you’d like to see in our job portal, feel free to get in touch. Be sure to follow us on LinkedIn as well!

Next article
#Business #Featured

Self-driving cars that run on simple maps and video camera feeds?

Courtesy: Chelsea Turner

There are more than 60 companies in the United States alone that are developing technologies for self-driving cars. While some are rethinking maps at an unprecedented centimeter-level resolution, others are finding ways to make the sensor rig affordable to the consumer. A team of researchers at Massachusetts Institute of Technology (MIT), meanwhile, is trying to bypass all the fuss by getting autonomous vehicles to mimic human driving patterns – using only simple GPS maps and video camera feeds.

While human drivers can easily navigate new, unfamiliar locations armed with only a basic map, the driverless cars being tested today rely on computationally intensive, meticulously labeled maps. These maps, created from LiDAR scans, are so massive that it takes 4,000 gigabytes of data to store just the city of San Francisco.

MIT says it can get the job done with much less. The maps used by its system, Variational End-to-End Navigation and Localization, capture the whole world in only 40 gigabytes of data.

So, how does this thing work?

MIT’s system relies on a machine learning model of the kind commonly used for image recognition. The model trains by observing a human driver steer in real-world conditions. To explain the technology better, Daniela Rus, director of the Computer Science and Artificial Intelligence Laboratory (CSAIL) and co-author of the research paper, gives the example of how the system would react to a T-shaped intersection.

“Initially, at a T-shaped intersection, there are many different directions the car could turn,” Rus says. “The model starts by thinking about all those directions, but as it sees more and more data about what people do, it will see that some people turn left and some turn right, but nobody goes straight. Straight ahead is ruled out as a possible direction, and the model learns that, at T-shaped intersections, it can only move left or right.”
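To make that idea concrete, here is a minimal toy sketch in Python, not MIT’s actual variational model: it simply tallies observed human turns at a hypothetical T-shaped intersection and turns them into a probability distribution over steering directions, so that an option nobody takes (going straight) is gradually ruled out. All names and numbers here are illustrative assumptions.

```python
# Illustrative toy only -- NOT MIT's Variational End-to-End Navigation and
# Localization system. It just mimics the intuition from the quote above:
# with no data, every direction at a T-intersection seems possible; as
# observations of human drivers accumulate, directions nobody takes fade out.

from collections import Counter

ACTIONS = ["left", "straight", "right"]

def steering_distribution(observed_turns, smoothing=1.0):
    """Estimate P(direction) at a T-intersection from observed human turns.

    With no observations the distribution is uniform; with many observations
    in which nobody drives straight, P('straight') shrinks toward zero.
    """
    counts = Counter(observed_turns)
    totals = {a: counts.get(a, 0) + smoothing for a in ACTIONS}
    z = sum(totals.values())
    return {a: totals[a] / z for a in ACTIONS}

# No data yet: the model "thinks about all those directions".
print(steering_distribution([]))        # ~33% left, ~33% straight, ~33% right

# After observing many human drivers who only ever turn left or right:
observed = ["left"] * 60 + ["right"] * 40
print(steering_distribution(observed))  # 'straight' is effectively ruled out
```

The real system learns a far richer mapping from camera images and coarse GPS maps to steering commands, but the narrowing-down behavior it exhibits at intersections follows the same logic as this counting example.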

The researchers have tested the system on randomly chosen routes in Massachusetts, and it has proven successful at recognizing distant stop signs or line breaks on the side of the road as signs of an upcoming intersection. Every time it approaches an intersection, the system rummages through its steering command database to make a decision, just as a human would.

Learn more about the technology in the video below and tell us how successful you think it will be in making self-driving cars a reality in the near future.
