
Creepy is a geolocation information aggregation tool. It allows users to gather geolocation information that has already been published and made publicly available on a number of social networking platforms and image hosting services. In its current beta form, it harvests geolocation information from Twitter in the form of geotagged tweets, Foursquare check-ins, and geotagged photos posted to Twitter via image hosting services such as yfrog, Twitpic, Plixi and others, as well as from Flickr. The geolocation information that Creepy retrieves is presented to the user as points on a navigable embedded map, along with a list of locations (geographic latitude and longitude accompanied by the date and time). Each location is associated with contextual information, such as the text the user tweeted from that specific location or the photo he or she uploaded.

Use – Features

Creepy’s main functionality is divided into two tabs in the main interface: the “targets” tab and the “mapview” tab. Starting Creepy brings you to the targets interface.


Before we can start using it, we should authorize Creepy to use our Twitter account so that it can access the Twitter API. This step is necessary if we want to use the search function to look up Twitter users from within Creepy, and to access users we follow whose timelines are not public (protected). This can be done by navigating to Edit → Settings.



Clicking “Authorize Creepy” invokes the web browser and points it to Twitter where, after providing the credentials for our account, we receive a PIN number. Entering this PIN back into Creepy and clicking “OK” completes the authorization process. The authentication and authorization process uses the OAuth protocol, so our user credentials are never shared with the application itself. The authorization can be revoked at any time from the account settings on Twitter’s website. The next two tabs in Settings can be used to specify a key for accessing the Flickr API should this be necessary (Creepy comes with a predefined API key; there is no actual need to change it unless it stops working), and to specify the directory on our local disk where photos should be downloaded so that they can be analyzed for geolocation-related EXIF tags.
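To give a sense of what the EXIF analysis step involves: EXIF metadata stores GPS coordinates as degrees/minutes/seconds values plus a hemisphere reference, which must be converted to the signed decimal form shown in Creepy's location list. The following is a minimal sketch of that conversion; the function name is illustrative, not Creepy's actual code.

```python
# Sketch: converting EXIF-style GPS tags to decimal coordinates.
# EXIF stores latitude/longitude as (degrees, minutes, seconds)
# plus a hemisphere reference ("N"/"S" or "E"/"W").

def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert a degrees/minutes/seconds triple to a signed decimal."""
    decimal = degrees + minutes / 60.0 + seconds / 3600.0
    # Southern and western hemispheres are negative.
    if ref in ("S", "W"):
        decimal = -decimal
    return decimal

# Example: 37° 58' 46.2" N, 23° 42' 58.8" E (roughly Athens)
lat = dms_to_decimal(37, 58, 46.2, "N")
lon = dms_to_decimal(23, 42, 58.8, "E")
```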





Now that we have authorized Creepy to use our Twitter account and adjusted the settings to our preference, we are ready to use the application. If we do not know the Twitter handle or the Flickr ID of the user we want to “target”, we can use the built-in search functions for the two services.



For Twitter, we can search by username, full name or email. For Flickr, we can search by full name or username, selecting the respective search button. The search results include the username, the full name, a profile picture and some additional information, such as the declared location where applicable.

Double-clicking the correct result populates the text fields at the top. Alternatively, we can enter the Twitter username or Flickr ID manually if we already know it.


Once we are satisfied with our search queries, we hit “Geolocate Target”.


The geolocation information retrieval can be a lengthy process, depending on how many tweets the “target” has and how many of those tweets include a photo that needs to be downloaded and analyzed. It can take anywhere from 1–2 minutes up to a quarter of an hour, so please be patient. Once the retrieval and analysis are over, we are presented with results that look similar to the following:

On the left side we see a list of all the retrieved locations as latitude, longitude and timestamp triples; on the right side, the embedded map with all the locations, initially focused on the most recently retrieved one. At the bottom of the user interface there is a text field with useful information about the retrieval process, such as how many tweets were analyzed, and about any errors Creepy encountered while trying to access the targeted services.
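Picking the map's initial focus from those triples amounts to taking the entry with the latest timestamp. A small sketch with illustrative data:

```python
from datetime import datetime

# Sketch: selecting the most recent location from (lat, lon, timestamp)
# triples, as the map's initial focus. The data below is illustrative.
locations = [
    (37.9795, 23.7163, datetime(2011, 3, 1, 14, 30)),
    (48.8566, 2.3522, datetime(2011, 3, 5, 9, 15)),
    (51.5074, -0.1278, datetime(2011, 2, 20, 18, 0)),
]

# The triple with the latest timestamp becomes the initial map focus.
most_recent = max(locations, key=lambda loc: loc[2])
```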

From here, we can navigate the map using the embedded controls, or simply by dragging the map around, to get a detailed view of specific geographic areas, or zoom out for a global overview of the retrieved locations across wider areas.


Alternatively, we can navigate the results using the location list on the left. Double-clicking any location in the list focuses the map on that specific geographic location and also presents the relevant contextual information in the text area underneath the map.



As we can see, the geolocation information was retrieved from Twitter, and we can also see what the “target” tweeted from that specific location, along with a link to the actual tweet on Twitter’s website. (Information is blurred out due to privacy concerns.) Right-clicking a location in the list gives us the option to either copy the latitude–longitude pair to the system clipboard or open the location in our browser using Google Maps. This can be very handy when, using Google’s Street View, we can get a real-world, though probably not real-time, impression of the target’s location.
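In essence, the two right-click actions come down to formatting the coordinate pair as a string. The exact formats below are assumptions for illustration, not Creepy's verbatim output:

```python
# Sketch of the two right-click actions: a clipboard-friendly
# latitude/longitude string, and a Google Maps query URL.
# Both formats are assumptions, not Creepy's exact output.

def latlon_for_clipboard(lat, lon):
    """Format a latitude/longitude pair for copying to the clipboard."""
    return f"{lat:.6f}, {lon:.6f}"

def google_maps_url(lat, lon):
    """Build a Google Maps query URL for the given location."""
    return f"https://maps.google.com/?q={lat:.6f},{lon:.6f}"

print(latlon_for_clipboard(37.9795, 23.7163))  # 37.979500, 23.716300
print(google_maps_url(37.9795, 23.7163))
```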

We can also select the map provider and the associated view from a list that includes Google (satellite, street and hybrid views), Virtual Earth Maps (satellite, street and hybrid views) and OpenStreetMap, among others.





All the geolocation information retrieved for each “target” is cached internally by Creepy. The next time we search for a specific target, only the newest tweets are retrieved and analyzed, and all previously known locations are loaded from the cache. This limits calls to the Twitter and Flickr APIs, minimizing the chance of hitting the APIs’ rate limits and rendering the application unusable for a period of time; it also reduces network usage and ensures information persistence in case the “target” decides to erase some of it, or a service is temporarily unavailable. Moreover, errors are cached as well. For example, if Creepy was for some reason unable to download and analyze a number of photographs from an image hosting service, it “remembers” those specific images and will try to access them again the next time we search for that “target”.
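The caching behaviour described above can be sketched as a per-target store that keeps already-retrieved locations, the newest tweet id seen (so a later run can ask Twitter only for tweets after it, via the API's `since_id` parameter), and the photos that failed to download. The class and field names are illustrative, not Creepy's internal structures.

```python
# Sketch of incremental per-target caching as described in the text.
# Field names and structure are assumptions for illustration.

class LocationCache:
    def __init__(self):
        # target handle -> {"since_id", "locations", "failed_photos"}
        self._store = {}

    def get(self, target):
        """Return (creating if needed) the cache entry for a target."""
        return self._store.setdefault(
            target, {"since_id": 0, "locations": [], "failed_photos": set()}
        )

    def update(self, target, new_locations, newest_id, failed_photos=()):
        """Merge newly retrieved results into the target's entry."""
        entry = self.get(target)
        entry["locations"].extend(new_locations)
        # Next run only fetches tweets newer than this id.
        entry["since_id"] = max(entry["since_id"], newest_id)
        # Remember failed photo downloads so they are retried next run.
        entry["failed_photos"].update(failed_photos)

cache = LocationCache()
cache.update("someuser", [(37.97, 23.71, "2011-03-01T14:30")], newest_id=420)
cache.update("someuser", [(48.85, 2.35, "2011-03-05T09:15")], newest_id=431,
             failed_photos={"http://twitpic.com/example"})
```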

Apart from this internal caching, we have the option to export our results as a comma-separated values (.csv) file that can be imported into any other application. The export function (Creepy → Export as) can also export the location list as a KML file, to be used with the Google Earth software for further analysis.
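For a rough idea of what the two export formats contain, here is a minimal sketch producing a CSV of latitude/longitude/timestamp rows and a bare-bones KML document that Google Earth can open. The exact fields Creepy writes may differ; note that KML lists coordinates in longitude-first order.

```python
import csv
import io

# Illustrative location triples; Creepy's exact export fields may differ.
locations = [
    (37.9795, 23.7163, "2011-03-01T14:30:00"),
    (48.8566, 2.3522, "2011-03-05T09:15:00"),
]

def to_csv(locations):
    """Serialize the location list as comma-separated values."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["latitude", "longitude", "timestamp"])
    writer.writerows(locations)
    return buf.getvalue()

def to_kml(locations):
    """Build a minimal KML document with one Placemark per location."""
    placemarks = "".join(
        f"<Placemark><TimeStamp><when>{ts}</when></TimeStamp>"
        # KML coordinates are written longitude,latitude.
        f"<Point><coordinates>{lon},{lat}</coordinates></Point></Placemark>"
        for lat, lon, ts in locations
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
        f"{placemarks}</Document></kml>"
    )
```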


