
Databroker Files: New data set reveals 40,000 apps behind location tracking

15 January 2025, 13:42

380 million location data points from 137 countries: a previously unknown data set from a US data broker shows the dangers of the global data trade. 40,000 apps are affected, including queer dating apps. For some apps, the shared location data is highly precise.

An illustration with a smartphone watched over by a huge eye
Data about users of popular apps ends up at data brokers – Phone: Pixabay; app icons: Freepik; eye: maxpixel.net/CC0; fog: Vecteezy; montage: netzpolitik.org

This is the English summary of a longer German-language article. The publication is part of the “Databroker Files” series.

A new data set obtained from a US data broker reveals for the first time about 40,000 apps from which users’ data is being traded. A journalist from netzpolitik.org received the data set as a free preview sample for a paid subscription. It is dated to a single day in the summer of 2024.

Among other things, the data set contains 47 million “Mobile Advertising IDs”, to which 380 million location data points from 137 countries are assigned. In addition, the data set contains information on devices, operating systems and telecommunications providers.
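To illustrate this structure, here is a hypothetical sketch of what a single record in such a broker data set might look like. The field names and values are assumptions for illustration only; the data set is only known to contain advertising IDs, location data and information on devices, operating systems and telecommunications providers.

```python
# Hypothetical record layout; field names and values are illustrative only.
from dataclasses import dataclass

@dataclass
class BrokerRecord:
    maid: str          # Mobile Advertising ID (GAID on Android, IDFA on iOS)
    latitude: float    # reportedly precise for some apps,
    longitude: float   # IP-derived and rough for others
    timestamp: str     # the sample covers a single day in summer 2024
    country: str
    device_model: str
    os_version: str
    carrier: str       # telecommunications provider

example = BrokerRecord(
    maid="38400000-8cf0-11bd-b23e-10b96e40000d",  # made-up example ID
    latitude=52.5200, longitude=13.4050,          # central Berlin, as an example
    timestamp="2024-07-15T09:32:11Z",
    country="DE",
    device_model="Pixel 7",
    os_version="Android 14",
    carrier="ExampleTelecom",
)
```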

This investigation is part of an international cooperation by the following media: Bayerischer Rundfunk/ARD (Germany), BNR Nieuwsradio (Netherlands), Dagens Nyheter (Sweden), Le Monde (France), netzpolitik.org (Germany), NRK (Norway), SRF/RTS (Switzerland) and WIRED (USA).

Overview of our findings

  • The approximately 40,000 apps in the new dataset cover a wide range of categories, from gaming, dating and shopping to news and education. They include some of the most popular apps worldwide, with millions of downloads in some cases.
  • For a smaller number of apps, the data set contains alarmingly precise location data. This data can help to identify a person’s place of residence. These apps include the queer dating app Hornet with more than 35 million users; the messaging app Kik with more than 100 million downloads in the Google Play Store alone; Germany’s most popular weather app Wetter Online, which also has more than 100 million downloads in the Google Play Store; the flight tracking app Flightradar24 with more than 50 million downloads in the Google Play Store; the app of German news site Focus Online; and classifieds apps for German users (Kleinanzeigen) and French users (leboncoin).
  • For a bigger number of apps, less precise locations, which appear to have been derived from IP addresses, can be found in the data set. This list includes popular apps such as Candy Crush, Grindr, Vinted, Happy Color, the dating apps Lovoo and Jaumo, the news aggregator Upday, the German email apps gmx.de and web.de as well as the popular Dutch weather app Buienalarm.
  • Since the sample only covers one day, it is difficult to identify people based on their locations from this data set alone. However, in combination with other data sets from the advertising industry, which the research team obtained from data brokers, it is possible to identify and track people on a large scale. The location data might, for example, provide clues to their home and work addresses (see the sketch after this list).
  • Thus, the team was able to identify users of Wetter Online in Germany and Kik in Norway. The individuals confirmed that the data must belong to their devices and their use of the respective apps.
  • Location data aside, the mere information about who uses which apps can already be dangerous. For example, the data set includes numerous Muslim and Christian prayer apps, health apps (blood pressure, menstruation trackers) and queer dating apps, which point to special categories of personal data under the GDPR.
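The following minimal sketch illustrates the reasoning behind the home-address clues mentioned in the list above. It assumes location points grouped by advertising ID and simply picks the most frequent night-time location; the input format and thresholds are assumptions, and with only one day of data this stays a rough guess, which is why the research team combined several data sets.

```python
# Minimal sketch: guess a likely home location from night-time points.
# Input format and thresholds are assumptions for illustration.
from collections import Counter
from datetime import datetime

def likely_home(points):
    """Return the most frequent night-time location, rounded to roughly 100 m."""
    night = []
    for ts, lat, lon in points:
        hour = datetime.fromisoformat(ts).hour
        if hour >= 22 or hour < 6:
            night.append((round(lat, 3), round(lon, 3)))
    return Counter(night).most_common(1)[0][0] if night else None

points = [  # (timestamp, latitude, longitude) for one advertising ID
    ("2024-07-15T23:10:00", 52.5201, 13.4049),
    ("2024-07-15T23:40:00", 52.5202, 13.4051),
    ("2024-07-15T12:00:00", 52.5300, 13.3900),  # daytime, e.g. a workplace
]
print(likely_home(points))  # -> (52.52, 13.405)
```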

Where did the data set come from?

The research team obtained the data set from US data broker Datastream Group, which now uses the name Datasys. The company did not respond to multiple requests for comment.

Contact with the data broker was established through the Berlin-based data marketplace Datarade. The company states in response to inquiries that it does not host any data itself. According to a spokesperson, “Data providers use Datarade to publish profiles and listings, enabling users to contact them directly”. Datarade “requires data providers to obtain valid consent in case they’re processing personal data and to aggregate or anonymize data in case they’re processing sensitive personal data”.

Where does the data originate?

According to our analysis, the data originates from Real Time Bidding (RTB), a process in the online advertising ecosystem: auctions in which the advertising inventory of apps and websites is sold. In the process, apps and websites send data about their users to hundreds or thousands of companies. This data contains the information that we can see in our data set. There have already been multiple warnings that advertising companies collect data from RTB in order to sell it – often without the knowledge or explicit consent of the users or the apps.
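To make this mechanism more tangible, the sketch below shows a heavily simplified example of the kind of bid request that travels through Real Time Bidding, loosely modelled on the OpenRTB format. All names and values are illustrative and not taken from the data set; the point is that every company receiving such a request can log the advertising ID and the location it contains.

```python
# Simplified, illustrative bid request loosely modelled on OpenRTB.
import json

bid_request = {
    "id": "auction-1234",
    "app": {"name": "ExampleWeatherApp", "bundle": "com.example.weather"},
    "device": {
        "ifa": "38400000-8cf0-11bd-b23e-10b96e40000d",  # Mobile Advertising ID
        "os": "Android",
        "carrier": "ExampleTelecom",
        "geo": {"lat": 52.5200, "lon": 13.4050, "type": 1},  # 1 = GPS-derived
    },
    "imp": [{"id": "1", "banner": {"w": 320, "h": 50}}],
}

# This request is broadcast to hundreds or thousands of bidders per ad slot;
# any of them can store it, which is how such data can end up with brokers.
print(json.dumps(bid_request, indent=2))
```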

What the apps say

None of the apps we have confronted so far say they have business relations with Datastream Group / Datasys. Hornet and Vinted, for example, wrote that they cannot explain how their users’ data ended up with data brokers. Queer dating app Hornet emphasizes that it does not share actual location data with third parties and has announced an investigation. Other companies such as Kik, Wetter Online, Kleinanzeigen, Flightradar, Grindr and King, the company behind the game Candy Crush, did not respond to press inquiries.

Reactions

Experts from politics, government agencies and civil society expressed concern about the findings.

Michael Will, the Bavarian data protection commissioner, said there would be consequences. In an interview, he describes the findings of the investigation as disillusioning and alarming. The data protection official views the situation as a blatant breach of trust. “This is contrary to everything that the average users of apps would expect – to be able to track where they have been for months afterwards.” The data broker should not have had this data. “This is beyond the agreed rules of the game.”

Will also expresses criticism of Real Time Bidding: anyone who uses it to display advertising must ask themselves whether their own contractual partners are really abiding by the contract. As a result of the investigation, the data protection authority wants to take action itself. “We have investigative powers. We will now make intensive use of them on the basis of the information you have provided,” says Will – and points out that his authority can also impose sanctions. “We have the option of imposing quite considerable fines.”

In view of the latest findings, the German Federal Ministry for Consumer Protection (BMUV) writes that the very collection of the data must be prevented. “We need effective EU-wide protection against personalized advertising to prevent app providers from having incentives to collect more data than is necessary to offer an app.” The ministry continues to advocate a “consistent switch to alternative advertising models”.

In addition, the Ministry for Consumer Protection is calling for technical standards to prevent devices from collecting identifying data in the first place. “The manufacturers of operating systems and end devices also have a role to play here.” Finally, the supervisory authorities would need to take consistent action.

Michaela Schröder from the Federation of German Consumer Organizations (vzbv) comments: “The current findings show and confirm once again that the global online advertising market has escaped any control. Unscrupulous data traders collect and disseminate highly sensitive information about people, while websites and apps make these illegal practices possible in the first place and the supervisory authorities seem to be completely overwhelmed.”

Consumers are left defenceless against the massive risks posed by data trading, says Schröder. The vzbv is therefore calling for action at the European level. “It is long overdue for the European Commission to effectively protect consumers and present a proposal to ban personalized advertising – for example, through the announced Digital Fairness Act,” Schröder said.

Difference from the Gravy Analytics leak

The results of our investigation confirm and expand on the insights that experts gained in early January from data obtained by hackers from US data broker Gravy Analytics. The Gravy Analytics leak also mentions thousands of apps; this data also apparently comes from Real Time Bidding. Among them are numerous apps that are also represented in our dataset from the Datastream Group: Candy Crush, Grindr, Kik, Wetter Online, Focus Online, FlightRadar24, Kleinanzeigen and many more.

However, the list of apps in our data set is much longer. For the first time, we can also differentiate between the apps for which only very rough location data is available and those whose users can be located exactly. It is this precise location data that puts users at particular risk, as it allows conclusions to be drawn about their home addresses and movement patterns.



Depictions of child abuse: The internet forgets nothing, as long as it’s not supposed to forget

28 January 2022, 14:05
Screen with blurred pictures
Police employees at a hotline for child abuse suspicions in North Rhine-Westphalia. – All rights reserved IMAGO / Future Image

In December 2021, journalists from the ARD political magazine Panorama and the NDR format STRG_F (funk), together with Der Spiegel, revealed that photos and videos showing serious sexual abuse of children often remain on the net for years – even though investigating authorities could have them removed. Victim protection organizations and child and youth psychologists subsequently spoke of a “slap in the face of those affected”; a group of MEPs is now seeking clarification.

In their guest article, Robert Bongen and Daniel Moßbrucker, who were part of the research team, describe why a change of strategy in the fight against child abuse would be not only a question of investigative tactics but also a political one.

In one of his last appearances in November 2021, the then acting Federal Minister of the Interior, Horst Seehofer, became unusually emotional. At the autumn conference of the Federal Criminal Police Office (BKA), Seehofer highlighted an area of great personal concern to him: the spread of depictions of sexual abuse of children. The number of such depictions increases enormously from year to year, Seehofer said – and this trend must be stopped by all means. Because behind the depictions there is almost always real abuse:

What immeasurable suffering the perpetrators inflict on the children! Under no circumstances should the image or video material be permanently retrievable. Otherwise, those affected will become victims again and again, and this for a lifetime. The deletion of these pictures and videos is therefore indispensable.

There is really nothing to add to that. And yet, the vehemence of these words surprised us. Because at that time we had already completed our research. And only a few days later we reported that the Federal Criminal Police Office of all places – which as the central agency in Germany has special rights and duties in the fight against child abuse – has for years at best only partially fulfilled Seehofer’s demand.

Photos and videos are mostly filed with ordinary storage services, and the download links are shared in darknet forums. Yet the BKA does not systematically report these links to the storage services, even though doing so would make illegal material documenting serious child abuse disappear from the net.

How is it possible that an authority that is subordinate to the Federal Ministry of the Interior fails to do exactly what the responsible minister publicly calls “indispensable”?

(Almost) everything is allowed

The world in which the investigators of the BKA move is disturbing. One example: everyone who registers with an online service on the internet usually has to accept the “general terms and conditions”. These are often so long and complex that no one reads them.

On the world’s largest platform in the darknet where paedo-criminals meet, it is different. Here, the terms and conditions consist of only one sentence: if you want to join the forum, you should never post personal information about yourself. Otherwise, one suspects, everything is allowed here, as long as it meets with ever-growing approval: in this forum alone, around 3.7 million user accounts were registered at the end of 2021, under the dubious slogan “For Child Lovers”.

Investigative authorities let the forums grow

This forum was the focus of our research. Never before in the history of the internet has there been a larger platform of this kind. By way of comparison, the “Boystown” forum, which German authorities were able to shut down in April 2021, had “only” around 400,000 user accounts at the end. One should not equate these numbers with people, because many paedo-criminals open a new account with every login, which they never use again afterwards. Nevertheless, there is much to suggest that these darknet forums have managed to record enormous growth rates, especially in the last two to three years. At the time of our research, the users of the platform with the slogan “For Child Lovers” had clicked on the content there about 1.7 billion times, and the trend was rising rapidly.

There are technical reasons for the rise of these forums; for example, the Tor network behind them has become faster and faster in recent years. But above all – and this is the central finding of our research – investigating authorities from many countries, including the German Federal Criminal Police Office, are involuntarily making these platforms more and more attractive, because more and more content is available for download without the investigators intervening.

Darknet takes over the mediating role

According to our data analyses, the largest forum, “For Child Lovers”, alone had over 20 terabytes – that’s over 20,000 gigabytes – available for download in November. And this amount could be deleted within a very short time.

This is due to the special architecture of these paedo-criminal networks. The operators of the paedo-platforms operate in the anonymous darknet in order to build a digital meeting place for paedo-criminals. However, the volume of data that can be exchanged there by anyone is too large to be stored on the darknet platforms themselves. Therefore, the paedo-criminals choose storage services on the ordinary internet instead, so-called one-click filehosters.

They put their material in a folder, which they encrypt with a password as a so-called archive, and upload this archive to the file hoster. From the file hoster’s point of view, an encrypted mountain of data has been uploaded. In the darknet forum, the paedophiles then share the corresponding download link and the associated password. The file hosters are usually unaware of this because possible upload filters are not effective due to the password protection.

According to the law, the file hosters do not have to search for such material themselves. What’s more, it would even be illegal if the operators of a file hoster were to specifically search the internet for photos and videos showing child abuse. The file hosters are therefore dependent on receiving a tip-off.

97 percent of the content is on the Clearweb

The Canadian Center for Child Protection (C3P) estimates that only about three percent of the photos and videos documenting child abuse are hosted on the darknet itself. Ninety-seven percent, on the other hand, are on the Clearweb, the part of the Internet that can be accessed with ordinary browsers such as Firefox or Chrome. The Canadian charity organization C3P maintains, among other things, a hotline where suspected cases of child abuse can be reported. In addition, the organization specifically searches for depictions of abuse and reports them to the responsible internet companies.

According to its own account, C3P is also active on the darknet and searches forums for linked archives that lead to Clearweb hosts. So does the NGO take over this job, so that law enforcement agencies can take care of identifying the perpetrators? German security authorities are well aware of its work; some investigators even speak of the “Canadian approach”. How is it possible that in the largest of these darknet forums, of all places, no one systematically reports the content? C3P answers evasively when asked. Generally, it states:

We do agree with your finding that not enough is being done and this issue is largely being ignored. As your reporting points out, this lack of action continues to put victims in harm’s way and perpetuates trauma.

An NGO is to clean up the internet

In addition, C3P says it wants to decrypt all content first and check it carefully to make sure that only illegal material is reported. Filehosters must be able to trust that only illegal material will be reported. But this takes time. “As a relatively small charity, we have limited resources to tackle the volume of media we find,” C3P writes.

Indeed, it seems grotesque that more than 30 years after the development of the World Wide Web, a single non-profit organization with a small team and budget should have the task of cleansing the entire Internet of abusive depictions. C3P, which operates independently but is about half dependent on mostly Canadian government funding, regularly calls on internet companies to take more consistent action against the content on their servers. A problem for the internet industry, then?

Companies take more than 13 terabytes off the net

Initially, we had assumed that the responsibility for the masses of content linked in the darknet forums lay with the filehosters. However, when we randomly sent some links to the services and often received confirmation within a few minutes that everything had been removed, the narrative of the disreputable, anonymous one-click filehosters began to crumble. In the end, in a concentrated effort, we automatically collected around 80,000 working links that paedo-criminals had posted on one platform. Behind them lay around 13.55 terabytes of data. That’s about as much as if a person watched a video of a child being abused for a year, day and night, in HD quality.

After we sent the links to the filehosters, they removed the material from their servers within 48 hours at the latest. Before that, the content had been online for about one and a half years on average. The oldest link we found had been online for over six years and led to a video of a boy being abused and raped.

We sent a German image-hoster links to about 100,000 photos, which it took down from its servers within three hours. Some of these photos had also been lying dormant on the servers for years without the hoster having been informed by German authorities, according to its own statement.

Official statistics with an opaque data basis

These figures stand in enormous discrepancy to the amount of content that the Federal Criminal Police Office (BKA) takes off the net every year. Since a 2011 decision by the Bundestag, the German parliament, the BKA has been required to have illegal content deleted from the internet as comprehensively as possible. Since then, the approach of “deleting instead of blocking” has been considered the alternative to net blocking, which was heavily discussed under the title “Zensursula”, in reference to the then Family Minister Ursula von der Leyen, because critics feared that blocking would lead to state censorship of the internet. (The German word for censorship is “Zensur”.)

According to its annual report, the BKA followed up on about 6,800 tips about links leading to child abuse images. In most cases, the prosecutors succeeded in having the links deleted within a few days. In view of such figures, the approach of “deleting instead of blocking” is also considered successful in politics. The then Minister of Justice and Family Affairs, Christine Lambrecht, commented on the BKA’s annual report accordingly:

The high deletion rates and the comparatively short processing times prove that the concept of “deleting instead of blocking” is effective overall.

However, the roughly 6,800 BKA cases that appear in the official statistics are only a fraction of the masses we found in the large darknet forums. Many links are apparently not reported by the BKA in the first place. Why not?

The myth of the “servers abroad”

In the course of our research, we did not come across a single file- or image-hoster worldwide who did not respond to our reports. The hosters who were most abused by the paedo-criminals are located in Germany, France, Sweden, Iceland or do not state their place of business at all – but they all cooperated.

Surprising – but also irritatingly quick and easy. Why don’t the law enforcement agencies do this? In conversations with investigators, we heard again and again that the “servers abroad” were a central problem. Whether this applies to other areas of law, such as copyright infringement or online fraud, is something we cannot judge. However, for the linked content circulating in the darknet forums for the exchange of child abuse material, the server location is definitely not an obstacle.

As far back as 2009, the Scientific Service of the Bundestag stated that the BKA could send an “abuse e-mail” to foreign services to inform them about illegal content. Should the BKA not be able or willing to inform the foreign hosters directly, it could, in case of doubt, ask foreign law enforcement agencies for assistance.

Catch perpetrators, leave photos?

Although it would be technically possible and legally permissible (some lawyers even consider it mandatory), the German Federal Criminal Police Office, the BKA, hardly ever reports illegal content to the filehosters. Nor, by the way, do the few German storage services. Particularly striking: even after the BKA, together with the General Public Prosecutor’s Office in Frankfurt am Main, shut down the “Boystown” forum in April 2021, it did not have the content behind the links shared there removed. As a result, a few days after the takedown, paedo-criminals simply re-posted the still-working links in another forum, and much of the “Boystown” content was available again.

This is incomprehensible to child and youth psychologists and the German Child Protection Association. They were “stunned” that the BKA apparently failed to systematically delete images from the internet, they said in a joint statement in response to the research. This is a “slap in the face of those affected”:

The fact that images of their terrible experiences are still available on the internet is extremely stressful and makes it more difficult for them to come to terms with what they have experienced. Some victims speak of renewed abuse as soon as someone looks at the footage of their abuse.

But the current federal government also defends the procedure. In December, the Secretary of State for the Interior, Markus Richter, answered a written question (Schriftliche Frage) by Konstantin von Notz, a member of the Green Party: “Priority must be given to securing and evaluating the content that is needed for immediate measures to avert danger and for the presentation of criminal evidence.”

This is a typical argument used by prosecutors in the discussion: The priority is to catch the perpetrators and collect evidence for a conviction. In the best case, this could even save children who are still being abused. In contrast, the mere deletion of photos and videos, which can be uploaded elsewhere anyway, actually seems less relevant.

But is it really this sad dilemma that the authorities are facing?

Deletion does not hinder law enforcement, it supports it

On closer analysis, it becomes clear that catching perpetrators and rescuing children does not rule out reporting content. This “either/or” does not exist in such a radical form. What’s more, offences could probably even be prevented if deletion were more consistent.

With all this content, the forums have long since become a social space in which paedo-criminals are given the impression that it is normal to perform sexual acts on children, and that the children actually want this. Law enforcement officers investigating this area told us during the research that they notice a disinhibition on these platforms, so that possibly more people “feel like” trying things in real life. This was also confirmed by child and youth psychologists and the German Child Protection Association in their reaction to our reporting.

Nor can one claim that searching for the links is time-consuming. We needed about six hours with an online crawler, a kind of “search dog for the net”, to find the 80,000 links that led to over 13 terabytes. In fact, we were quite surprised at how easy it was to collect the links, as the paedo-criminals had no safeguards at all against automated downloads in the forum at the time.

Does deletion really destroy evidence?

It is also possible to preserve evidence when content is deleted. Nothing prevents investigators from automatically following the collected links and downloading the linked content once. Police officers could then view the photos and videos at a later time, but by reporting the files to the filehosters, they could already curb their distribution. The risk that legal content would be reported as “collateral damage” (so-called “overblocking”) is de facto excluded: these forums exist only to distribute child abuse material; other content is not tolerated there.

But what good is deletion if paedo-criminals still have the material on their hard drives and can upload it again at any time? We heard this argument again and again. It is not completely wrong, but here too our research suggests a more differentiated picture: deleting takes the momentum out of the forums, because an upload takes significantly longer than collecting a link. The uploaders usually use the slower Tor network for the upload so as not to reveal their true IP address to the Clearweb filehoster. Furthermore, they all lead normal lives and are not in the forum every day, so they may not even notice that their links have been deleted.

Annoying users “to death”

Incidentally, we received unexpected confirmation from the administrator of the largest forum at the time: we managed to get in touch with him. He wrote to us that consistent deletion can “annoy users to death”. If you delete long enough, it can happen that people leave and the administrators “shut the place down”. In other words, uploaders are annoyed because their work is destroyed, and consumers are annoyed because many links lead nowhere.

Pursued consistently and permanently, deletion would also help the internet companies block content, once reported, from being uploaded again. Some of the filehosters currently most abused by paedo-criminals already have such upload filters in place. They compute a so-called hash value from reported content and store it in a database.

If a user tries to upload a file with the same hash value again, the upload stops immediately. This even works for the encrypted archive files of the paedo-criminals – but only if the filehosters receive a tip from authorities about which specific files with the associated hash values are illegal.
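A minimal sketch of this hash-matching idea follows, under the assumption that a filehoster keeps a blocklist of hash values reported by the authorities. A real system would rely on audited hash lists and more robust infrastructure; this only illustrates the principle.

```python
# Minimal sketch of an upload filter based on a blocklist of reported hashes.
import hashlib

# SHA-256 values of files reported as illegal (placeholder value only)
reported_hashes = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def upload_allowed(file_path: str) -> bool:
    """Hash the uploaded file and reject it if the hash is on the blocklist."""
    sha256 = hashlib.sha256()
    with open(file_path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            sha256.update(chunk)
    return sha256.hexdigest() not in reported_hashes
```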

Not deleting may increase the number of perpetrators

Of course, particularly motivated users could always upload their material to new file hosts with different hash values. But how strong is this argument when not deleting undoubtedly leads to more and more people being able to store more and more material on their hard drives in order to upload it? So not deleting broadens the base of those who have the material on their hard drives, thus creating new uploaders.

This circumstance, the recently increasing dimensions of these darknet forums, is probably the decisive change that makes the priorities of the Federal Ministry of the Interior and the law enforcement agencies, which are perhaps understandable in principle, seem increasingly absurd: How wisely are resources used when, as in the “Boystown” case, four people are arrested after months of investigation, but then the content is not secured and taken off the net? So that only a few days after the shutdown, “Boystown” content is posted again elsewhere and thousands of users can still use it today?

There may have been a time when this was exactly the right strategy: shutting down a forum in order to eliminate an entire infrastructure with a successful investigation. But today these forums are multiplying, all blithely linking to content hidden on the Clearweb.

Our research shows that identical links to filehosters are often shared in different darknet forums. Conversely, this means that deleting the links of particular filehosters can remove the content from several darknet forums at once. The result is an efficient procedure to get criminal content off the net.

New government wants to strengthen the BKA

Horst Seehofer must simply not have known that the Federal Criminal Police Office apparently structurally fails to do exactly this when he so vehemently emphasized at the BKA autumn conference that deleting the recordings was “indispensable”. In its coalition agreement, the new government has now announced: “In the fight against child abuse, we will strengthen the Federal Criminal Police Office in terms of personnel.”

It is now a political question how Seehofer’s successor Nancy Faeser (SPD) will distribute these resources within the BKA. If she takes “deleting depictions of abuse” seriously as an essential part of the fight against child abuse, then she could achieve a great deal here with relatively little money and staff.

Robert Bongen works for Norddeutscher Rundfunk and is an editor for the ARD political magazine Panorama.

Daniel Moßbrucker works as a journalist on the topics of surveillance, data protection and internet regulation. He is also a trainer for digital security and darknet research.

