Showing posts with label big data. Show all posts

18 September 2022

Neo-Medievalism and the New Latin

By Emile Wolfaardt

Medieval Latin (or Ecclesiastical Latin, as it is sometimes called) was the primary language of the church in Europe during the Dark Ages. The Bible and its laws and commands were all in Latin, as were the punishments to be meted out to those who breached its dictates. This left interpretation and application up to the proclivities of the clergy. Because the populace could not understand Latin, there was no accountability for those who wielded the Latin sword.

We may have outgrown the too-simplistic ideas of infanticidal nuns and the horror stories of medieval torture devices (for the most part, anyway). Yet the tragedy of the self-serving ecclesiastical economies, the gorgonising abuse of spiritual authority, the opprobrious intrusion on privacy, and the disenfranchisement of the masses still cast a dark shadow of systemic exploitation and widespread corruption over that period. The few who were born into the ranks of the bourgeoisie ruled with deleterious absolutism and no accountability. The middle class was all but absent, and the subjugated masses lived in abject poverty without regard or recourse. There was no pathway to a better station in life. It was effectively a two-class social stratification system that enslaved by keeping people economically disenfranchised and functionally dependent. Their beliefs were defined, their behaviour was regulated, and their liberties were determined by those whose best interest was to keep them stationed where they were.

It is the position of this writer that our own day and age presents some alarming and dangerous parallels to that abuse, and that we need to be aware of them.

There has been a gargantuan shift in the techno-world, one that is both obfuscatory and ubiquitous. With the ushering in of the digital age, marketers realised that the more information they could glean from our choices and conduct, the better they could influence our thinking. They started analysing our purchasing history, listening to our conversations, tracking key words, and identifying our interests. They learned that people who say or text the word ‘camping’ may be in the market for a tent, and that people who buy rifles, belong to a shooting club, and live in a particular area are more likely to affiliate with a certain party. They learned that there was no such thing as excess data – that all data is useful and can be manipulated for financial gain.

Where we find ourselves today is that the marketing world has ushered in a new economic model, one that sees human experiences as free raw material to be taken, manipulated, and traded at will, with or without the consent of the individual. Google's vision statement for 2022 is ‘to provide access to the world's information in one click’. Everything – your heart rate read by your watch, your texts surveyed by your phone’s software, your words recorded by the myriad listening devices around you, your location identified by twenty apps on your phone, your GPS, your doorbell, the security cameras around your home – is garnering your data. And we even pay for these things. It is easier to find a route using a GPS than a map, and the convenience of smart technology seems, at first glance anyway, like a reasonable exchange.

Our data is being harvested systematically and sold for profit without our consent or remuneration. Our search history, buying practices, biometric data, contacts, location, sleeping habits, exercise routines, self-discipline, the articles we pause our scrolling to peruse, even whether we use exclamation marks in our texts – the list continues almost endlessly – a trillion bits of data are recorded each day. They are then analysed for behavioural patterns, organised to manipulate our choices, and sold to help advertisers prise the hard-earned dollars out of our hands. It is all written in a language very few people can understand, imposed upon us without our understanding, and used for financial gain by those who do not have our best interest at heart. Our personal and private data is traded for profit without our knowledge, consent, or benefit.

A new form of economic oppression has emerged, ruthlessly designed and implemented by the digital bourgeoisie, and built exclusively on harvesting our personal and private data – and we gladly exchanged it for the conveniences it offered. As a society, we have been gaslit into accepting this new norm. We are fed the information they choose to feed us, are subject to their manipulation, and are simply fodder for their profit machine. We are indeed in the oppressive age of Neo-Medievalism, and computer code is the new Latin.

It seems to have happened so quickly, and permeated our lives so completely – and all without our knowledge or consent.

But it is not hopeless. As oppressive as the Dark Ages were, that period came to an end. Why? Because there were people who saw what was happening, vocalised and organised themselves around a healthier social model, and educated themselves around human rights, oppression, and accountable leadership. After all – look at us now. We were birthed out of that period by those who ushered in the Enlightenment and ultimately Modernity.

Reformation starts with being aware, with educating oneself, with speaking up, and with joining our voices with others. There is huge value in this digital age we have wholeheartedly embraced. Instead of allowing it to oppress us, however, we must take back control of our data where we can. We must do what we can to maximise the opportunities it provides, join with those who see it for what it is, help others to retain their freedom, and be part of the wave of people and organisations calling for integrity, openness, and redefinition in the process. The digital age with its AI potential is here to stay. This is good. Let’s be a part of building a system that serves the needs of the many, that benefits humanity as a whole, and that lifts us all to a better place.

24 April 2022

The Dark Future of Freedom

by Emile Wolfaardt

Is freedom really our best option as we build a future enhanced by digital prompts, limits, and controls?

We have already surrendered many of our personal freedoms for the sake of safety – and yet we are just on the brink of a general transition to a society totally governed by instrumentation. Stop! Please read that sentence again! 

Consider for example how vehicles unlock automatically as authorised owners approach them, warn drivers when their driving is erratic, alter the braking system for the sake of safety and resist switching lanes unless the indicator is on. We are rapidly moving to a place where vehicles will not start if the driver has more alcohol in their system than is allowed, or if the license has expired or the monthly payments fall into arrears.

There is a proposal in the European Union to equip all new cars with a system that will monitor where people drive, when, and above all, at what speed. The data will be transmitted in real time to the authorities.

Our surrender of freedoms, however, has advantages. Cell phones alert us if those carrying contagions are close to us, and Artificial Intelligence (AI) and smart algorithms now land our aeroplanes and park our cars. When it comes to driving, AI has a far better track record than humans: in a recent study, Google claimed that its autonomous cars were ‘10x safer than the best drivers’ and ‘40x safer than teenagers’. AI promises, reasonably, to provide health protection and disease detection. Today, hospitals are using solutions based on Machine Learning and Artificial Intelligence to read scans. Researchers from Stanford developed an algorithm to assess chest X-rays for signs of disease. This algorithm can recognise up to fourteen types of medical condition – and was better at diagnosing pneumonia than several expert radiologists working together.
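To make the idea concrete, here is a minimal sketch of how the output of such a multi-label scan reader is typically used. The condition names, probabilities, and threshold below are illustrative assumptions, not the Stanford model itself; a real system produces the probabilities from the image.

```python
# Illustrative only: a multi-label chest X-ray classifier outputs one
# probability per condition; a report then flags every condition whose
# probability clears a chosen threshold. All numbers here are made up.

def flag_conditions(probabilities, threshold=0.5):
    """Return the conditions whose predicted probability meets the threshold."""
    return [c for c, p in probabilities.items() if p >= threshold]

# Hypothetical model output for one scan:
scan_output = {"pneumonia": 0.91, "cardiomegaly": 0.12,
               "edema": 0.55, "nodule": 0.08}

print(flag_conditions(scan_output))  # conditions the model would flag
```

The point of the sketch is that ‘recognising fourteen conditions’ means fourteen independent yes/no judgements on one image, which is why such systems can be compared condition-by-condition against radiologists.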

Not only that, but AI promises both to reduce human error and to intervene in criminal behaviour. PredPol is a US-based company that uses Big Data and Machine Learning to predict the time and place of a potential offence. The software looks at existing data on past crimes and predicts when and where the next crime is most likely to happen. It has demonstrated a 7.4% reduction in crime across cities in the US and created a new avenue of study in Predictive Policing. It already knows the type of person who is likely to commit the crime, and tracks their movement toward the place of anticipated criminal behaviour.
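The underlying idea can be sketched in a few lines. This is emphatically not PredPol’s proprietary model – just a toy illustration of the ‘past crimes predict future hotspots’ logic the paragraph describes, with invented grid cells:

```python
# Toy grid-based hotspot prediction: divide a city map into cells,
# count past incidents per cell, and rank cells by historical frequency.
# Real predictive-policing systems use far richer models; this only
# shows the shape of the idea.
from collections import Counter

def hotspots(past_incidents, top_n=2):
    """Rank map cells by recorded incident count.

    past_incidents: list of (row, col) grid cells where crimes occurred.
    Returns the top_n cells this naive model would predict for the
    next incident.
    """
    counts = Counter(past_incidents)
    return [cell for cell, _ in counts.most_common(top_n)]

history = [(2, 3), (2, 3), (0, 1), (2, 3), (4, 4), (0, 1)]
print(hotspots(history))  # the cells with the most recorded incidents
```

Even this toy version makes the essay’s concern visible: the prediction is only as good – and only as fair – as the historical data fed into it.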

Here is the challenge – this shift to AI, or ‘instrumentation’ as it is commonly called, has been both obfuscatory and ubiquitous. And there are two big questions about this colossal shift that nobody is talking about.

Firstly, the entire move to the instrumentation of society is predicated on the wholesale surrender of personal data. Phones, watches, GPS systems, voicemails, e-mails, texts, online tracking, transaction records, and countless other instruments capture data about us all the time. This data is used to analyse, predict, influence, and control our behaviour. In the absence of any governing laws or regulations, the Googles, Amazons, and Facebooks of the world have obscured the fact that they collect hundreds of billions of bits of personal data every minute – including where you go, when you sleep, what you look at on your watch or phone or other device, which neighbour you speak to across the fence, how your pulse increases when you listen to a particular song, how many exclamation marks you put in your texts, and so on. And they collect your data whether or not you want or allow them to.

Opting out is nothing more than donning the Emperor’s new clothes. Your personal data is collated and interpreted, and then sold on a massive scale to companies without your permission or remuneration. Not only are Google, Amazon, Facebook, and the rest marketing products to you; they are altering you, based on their knowledge of you, to purchase the products they want you to purchase. Perhaps they know a user has a particular love for animals, and that she bought a Labrador after seeing it in the window of a pet store. She has fond memories of sitting in her living room talking to her Lab while ‘How Much Is That Doggie in the Window’ played in the background. She then lost her beautiful Labrador to cancer. And would you know it – an ad ‘catches her attention’ on her phone or her Facebook feed, with a Labrador just like hers and a familiar voice singing a familiar song taking her back to her warm memories, and then the ad turns to collecting money for canine cancer. This is known as active priming.

According to Google, an elderly couple were recently caught in a life-threatening emergency and needed to get to the doctor urgently. They headed to the garage and climbed into their car – but because they were late on their payments, AI shut their car down; it would not start. We have moved from active priming into invasive control.

Secondly, data harvesting has become so essential to the business model that it is already past the point of reversal. It is ubiquitous. When challenged about this by the US House recently, Mark Zuckerberg offered that Facebook would be more conscientious about regulating itself. The fox offered to guard the henhouse. Because this transition was both hidden and wholesale, by the time lawmakers started to see the trend it was too late – and too many Zuckerbucks had been ingested by the political system. The collection of big data has become irreversible, and now practically defies regulation.

We have transitioned from the Industrial Age where products were developed to ease our lives, to the Age of Capitalism where marketing is focused on attracting our attention by appealing to our innate desire to avoid pain or attract pleasure. We are now in what is defined as the Age of Surveillance Capitalism. In this sinister market we are being surveilled and adjusted to buy what AI tells us to buy. While it used to be true that ‘if the service is free, you are the product,’ it is now more accurately said that ‘if the service is free, you are the carcass ravaged of all of your personal data and freedom to choose.’ You are no longer the product, your data is the product, and you are simply the nameless carrier that funnels the data.

And all of this is marketed under the reasonable promise of a more cohesive and confluent society where poverty, disease, crime, and human error are minimised, and a Global Base Income is promised to everyone. We are told we are now safer than in a world where criminals have the freedom to act at will, dictators can obliterate their opponents, and human error costs tens of millions of lives every year. Human behaviour is regulated and checked when necessary, disease is identified and cured before it ever proliferates, and resources are protected and maximised for the common betterment. We are now only free to act in conformity with the common good.

This is the dark future of freedom to which we are already committed – albeit unknowingly. The only question remaining is this: whose common good are we free to act in conformity with? We have come far down the road of the subtle and ubiquitous loss of our freedoms, but it may not be too late to take back control. We need to educate ourselves, stand together, and push back against the wholesale surrender of our freedom without our awareness.

23 June 2019

The world in crisis: it’s not what we think

Posted by Thomas Scarborough

The real danger is an explosion – of Big Data

We lived once with the dream of a better world: more comfortable, more secure, and more advanced.  Political commentator Dinesh D’Souza called it ‘the notion that things are getting better, and will continue to get better in the future’.  We call it progress.  Yet while our world has in many ways advanced and improved, we seem unsure today whether the payoff matches the investment.  In fact, we all feel sure that something has gone peculiarly wrong—but what?  Why has the climate turned on us?  Why is the world still unsafe?  Why do we still suffer vast injustices and inequalities?  Why do we still struggle, if not materially, then with our sense of well-being and quality of life?  Is there anything in our travails which is common to all, and lies at the root of them all?

It will be helpful to consider what it is that has brought us progress – which in itself may lead us to the problem. There have been various proposals: that progress is of the inexorable kind; that it is illusory, rooted in the hubristic belief that earlier civilisations were always backward; or that it is a result of our escape from blind authority and the appeal to tradition. Yet above all, progress is associated with the liberating power of knowledge, which now expands at an exhilarating pace on all fronts. ‘The idea of progress,’ wrote the philosopher Charles Frankel, ‘is peculiarly a response to ... organized scientific inquiry’.

Further, science, within our own generation, has quietly entered a major new phase, which began around the start of the 21st century. We now have big data: extremely large data sets which may be analysed computationally.

Now when we graph the explosion of big data, we interestingly find that it roughly coincides, when plotted on the same axes, with various global trends – among them increased greenhouse gas emissions, sea level rise, economic growth, resource use, and air travel – even increased substance abuse and terrorism. There is something, too, which seems more felt than demonstrable. A great many people sense that modern society burdens us – more so than it did in former times.

Why should an explosion of big data roughly coincide—even correlate—with an explosion of global travails?

On the one hand, big data has proved beyond doubt to have many benefits. Through the analysis of extremely large data sets, we have found new correlations to spot business trends, prevent diseases, and combat crime – among other things. At the same time, big data presents us with a raft of problems: privacy concerns, interoperability challenges, the problem of imperfect algorithms, and the law of diminishing returns. A major difficulty lies in the interpretation of big data. Researchers Danah Boyd and Kate Crawford observe, ‘Working with Big Data is still subjective, and what it quantifies does not necessarily have a closer claim on objective truth.’ Not least, big data depends on social sorting and segmentation – mostly invisible – which may have various unfair effects.
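The ‘coincidence’ of trends invoked above can be stated precisely as correlation, and Boyd and Crawford’s caution can be illustrated in a few lines. The two series below are synthetic stand-ins (say, for yearly data volume and yearly air travel); none of the numbers come from the essay, and the point is only that two rising curves correlate almost perfectly without either causing the other:

```python
# Hedged illustration: Pearson's correlation coefficient for two series.
# Two quantities that both grow over time will correlate strongly even
# if neither causes the other -- which is why graphed coincidences
# demand interpretation, not just measurement.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

data_volume = [1, 2, 4, 8, 16, 32]      # synthetic growth curve
air_travel  = [10, 13, 19, 30, 52, 95]  # synthetic, rising in step

print(round(pearson(data_volume, air_travel), 3))  # close to 1.0
```

A coefficient near 1.0 here demonstrates nothing about causation – which is exactly the interpretive gap the essay goes on to probe.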

Yet apart from the familiar problems, we find a bigger one.  The goal of big data, to put it very simply, is to make things fit.  Production must fit consumption; foodstuffs must fit our dietary requirements and tastes; goods and services must fit our wants and inclinations; and so on.  As the demands for a better fit increase, so the demand for greater detail increases.  Advertisements are now tailored to our smallest, most fleeting interests, popping up at every turn.  The print on our foodstuffs has multiplied, even to become unreadable.  Farming now includes the elaborate testing and evaluation of seeds, pesticides, nutrients, and so much more.  There is no end to this tendency towards a better fit.

The more big data we have, the more we can tailor any number of things to our need: insurances, medicines, regulations, news feeds, transport, and so on. However, there is a problem. As we increase the detail, so we require greater energy to do it. There are increased demands on our faculties, and on our world – not merely on us as individuals, but on all that surrounds us. To find a can of baked beans on a shop shelf is one thing. To have a can of French navy beans delivered to my door in quick time is quite another. This is crucial. The goal of a better fit involves enormous activity, and stresses our society and environment. Media academic Lloyd Spencer writes, ‘Reason itself appears insane as the world acquires systematic totality.’ Big data is a form of totalitarianism, in that it requires complete obedience to the need for a better fit.

Therefore the crisis of our world is not primarily that of production or consumption, of emissions, pollution, or even, in the final analysis, over-population.  It goes deeper than this.  It is a problem of knowledge—which now includes big data.  This in turn rests on another, fundamental problem of science: it progresses by screening things out.  Science must minimise unwanted influences on independent variables to succeed—and the biggest of these variables is the world itself.

Typically, we view the problems of big data from the inside, as it were—the familiar issues of privacy, the limits of big data, its interpretation, and so on.  Yet all these represent an enclosed view.  When we consider big data in the context of the open system which is the world, its danger becomes clear.  We have screened out its effects on the world—on a grand scale.  Through big data, we have over-stressed the system which is planet Earth.  The crisis which besets us is not what we think.  It is big data.



The top ten firms leveraging Big Data in January 2018: Alphabet, Amazon, Microsoft, Facebook, Chevron, Acxiom, National Security Agency, General Electric, Tencent, Wikimedia (Source: Data Science Graduate Programs).


Sample graphs. Red shade superimposed on statistics from 2000.