6 May 2026

Big Data vs Privacy: Where is the Ethical Limit of IoT in Spying on Consumer Behavior?

Harry Yulianto

Lecturer, STIE YPUP Makassar

Keywords: Big Data, Consumer Privacy, Ethical Boundaries, Internet of Things (IoT).

Imagine your smart home assistant accidentally recording a private conversation and sending it to a third party, or a smartwatch detecting its user’s pregnancy before the owner herself realizes it. These scenarios are no longer science fiction—they are real consequences of an Internet of Things (IoT) ecosystem that prioritizes limitless connectivity. Amid the dazzling convenience on offer, a fundamental question emerges that we must reflect on: are we truly enjoying digital services, or are we slowly becoming the “product” being bought and sold?

In Indonesia, IoT growth has surged dramatically, with millions or even billions of connected devices—from sensors in modern retail to smart home appliances—steadily eroding the boundary between private and public spaces. In this context, big data acts as the “new oil,” driving service personalization while simultaneously creating a mass surveillance ecosystem that operates without adequate control (Zuboff, 2019). Ironically, the efficiency and comfort gained often come at a high cost: the systematic erosion of consumer privacy rights.

The line between innovation that brings benefits and privacy violations is increasingly blurred, especially when regulation and ethical awareness fail to keep pace with the speed of technological expansion. As a result, consumers frequently find themselves in a vulnerable position, becoming victims of business models that prioritize limitless data extraction for short-term profit. Yet without protections that uphold individual rights, technological progress risks giving rise to new forms of hidden exploitation behind the narrative of personalization.

Between Personalization and Exploitation

There is no denying that the IoT has opened extraordinary opportunities for businesses to understand consumers in real time, creating more personal and relevant experiences. Imagine smart shelves in modern retail automatically offering discounts based on a customer’s shopping history, or an insurance policy whose premium is adjusted according to safe driving behavior through vehicle telematics. These benefits show how data ecosystems can improve efficiency and comfort, as long as they remain within clear ethical boundaries (Gawer, 2021).

However, behind the allure of personalization, data collection practices often rely on sham notice-and-consent mechanisms, where users are forced to agree to lengthy terms they never truly read. Moreover, we have entered the era of surveillance capitalism, where behavioral data is used not only to predict needs but also to manipulate consumer decisions—even before consumers themselves realize what they want (Zuboff, 2019). Isn’t it ironic that data intended to improve service quality instead becomes a commodity sold to third parties without transparency?

The most alarming consequence of this imbalance is the loss of consumer control over their own data, with no opt-out mechanism as easy and widespread as the opt-in mechanisms offered. Data collected through sensors, hidden cameras, or microphones in smart devices is highly intrusive because it is gathered in private spaces that should be the most protected domain. When devices in our own homes become invisible spies, where can we draw the line between convenience and total surrender?

Case Studies & Realities

On the global stage, privacy scandals involving smart devices are no longer merely theoretical concerns—they have become realities exposed across multiple countries. The case of household smart device data being sold to insurance companies, which then raised premiums based on domestic activity, shows how private data can be turned into an economic weapon that harms users. Likewise, the scandal of voice data collected by virtual assistants being listened to and analyzed by human staff—without user knowledge—raises a fundamental question: how much of our most intimate privacy have we surrendered for momentary convenience (Möslein, 2023)?

In Indonesia, although the IoT ecosystem is growing rapidly, legal awareness and consumer protection still lag far behind, evidenced by the scarcity of lawsuits related to IoT privacy due to low digital literacy and weak law enforcement. Practices such as shopping centers using facial recognition technology without explicit visitor consent, or on-demand applications demanding 24/7 location access for disproportionate reasons, have become everyday sights that rarely face serious challenge. Does this not show that Indonesian consumers are in a hazardous gray zone, where basic rights are eroded by irresponsible business practices (Setiawan & Nugroho, 2024)?

Research data reveals that more than 70 percent of consumers are unaware of the extent to which their data is collected and utilized by the IoT devices they use daily. In Indonesia, the National Cyber and Crypto Agency (BSSN) has recorded a significant increase in data breach incidents involving smart devices and the digital ecosystem, with a persistently alarming trend year after year (BSSN, 2024). These numbers are not merely statistics; they are alarms indicating that without serious intervention, consumers will continue to be victims of a deepening information asymmetry.

Ethical Dilemmas and Overlapping Regulations

IoT startup founders often stand at a difficult crossroads: should they sacrifice user privacy in pursuit of rapid growth and meeting investor expectations? Although deep personalization may seem to promise short-term competitive advantage, this approach risks creating a feeling of being “watched” that ultimately erodes consumer trust in the long run (Crawford & Schultz, 2024). Shouldn’t business sustainability be built on a foundation of trust rather than on exploitative data extraction?

Indonesia’s Personal Data Protection Law (UU PDP) has been enacted as a legal umbrella, yet its implementation and oversight mechanisms remain far from adequate, especially when addressing the complexity of the IoT ecosystem. Classic questions arise: are sensors installed in public spaces categorized as collecting personal data, and who is responsible when dozens of smart devices within one ecosystem share data without clear boundaries (Wahyuni & Hidayat, 2024)? Without firm legal certainty, consumers remain trapped in a hazardous gray zone.

In Europe, the General Data Protection Regulation (GDPR) takes a strict approach through the principle of privacy by design, requiring data protection to be integrated from the earliest stages of product development—a standard that still feels foreign in Indonesia, with its reactive regulatory approach. We cannot simply copy foreign regulations wholesale without building a collective awareness of data ethics rooted in our own local context (Purnomo & Lestari, 2023). Shouldn’t every nation have digital sovereignty that not only regulates but also cultivates a culture of respecting privacy as a fundamental right?

Collective Responsibility of Three Pillars

Regulators must act more decisively by tightening specific consent mechanisms for each type of data collected through IoT devices, no longer allowing broad, manipulative consent practices. Furthermore, the obligation to conduct privacy impact assessments before mass-scale IoT product launches must be urgently implemented as a preventive measure that has so far been absent in our digital ecosystem (Nugroho & Sari, 2024). Isn’t it time for the state to act not merely as a supervisor but as a proactive protector of its citizens’ fundamental rights?

For businesses, data ethics should not be seen as a burden but as a golden opportunity to become a unique selling point amid widespread, troubling data exploitation practices. By prioritizing transparency and giving users full control over their data, companies can build differentiation based on trust while also applying the principle of data minimization—collecting only the data truly necessary for core services (Hartono, 2023). Isn’t a business that endures one that is respected, not merely one that collects the most data?

On the other hand, consumers must no longer remain passive victims; it is time to raise digital literacy by reading each access permission requested, utilizing available privacy features, and daring to question how service providers use their data. Adopting the principle “not everything smart needs to be installed in the home” is a simple yet significant first step toward regaining control over our private spaces (Widiastuti & Firmansyah, 2024). When we begin to choose critically, won’t the market be driven to provide more responsible products?

Demand Ethical Boundaries

The Internet of Things and big data are powerful forces that have brought extraordinary efficiency to modern life, yet without ethics and regulations that prioritize privacy, we are merely building a world that is extremely comfortable yet extremely vulnerable to hidden exploitation. If we remain passive consumers, are we not complicit in dismantling the boundaries of privacy that should be among humanity’s most fundamental rights? It is time we dare to demand clear ethical limits before technology completely erases private space from the landscape of our lives.
