Buzzword Wiki

In this list you can find IT buzzwords and controversies in semi-chronological order. A click on each term shows short explanations and the positions that different players in society would typically adopt.

Introduction

•••  ••   ••  •••   !  !!

The white box shows explanations and links. The dots above indicate the range of risks (red) versus chances (green). The blue exclamation marks stand for the likelihood of ethical conflicts. The blue box contains conservative, economy-focused or authoritarian positions. The green box contains progressive, liberal and often consumer-focused positions.

Clickworkers’ labor conditions – 2018

•••  ••  •  ••  •••    !  !!

People in low-wage countries do the clean-up work of erasing potentially harmful content from social media platforms, often under terrible psychological working conditions.

Can't this be done by AI?

The working conditions and the overblocking are troubling, and we need to do something about both. (Yet we don't have a good idea how.)

#Work in Progress

•••  ••  •  ••  •••    !  !!

This representation is a work in progress. Of course it's strongly subjective and certainly not complete. Still, it may be useful as a general overview or as a first pointer to a subject. If you have remarks or interesting links to add, you are very welcome.

In these white fields you find explanations, buzzwords and useful links.

The green and red dots above indicate the range of chances and dangers involved. The blue exclamation marks indicate possible ethical conflicts.

In the blue fields, you find conservative, economically liberal and authoritarian positions. In the green fields, you find liberal, libertarian or progressive positions.

Algorithmic Decision Making (ADM) – 2016

•••  ••  •  ••  •••    !  !!

Responsibility is transferred to algorithms.

Problem: There is no economic incentive for the high effort of a thorough analysis, social participation or compliance with quality criteria.

The Guardian:
How algorithms rule the world

Kirsten Martin, Journal of Business Ethics:
Ethical Implications and Accountability of Algorithms

Soundcloud, with the example of driving assistance (in German):
• Keynote by Eric Hilgendorf, Frankfurter Zukunfts-Symposium 2016

We must ensure that ethical principles are upheld. We are confident that this can be handled, at least to a reasonable extent.

The algorithms are often secret, inscrutable (especially when self-learning) and error-prone.

Problems are often not so much about data protection as about overly one-dimensional conclusions. The design of the analyses may contain circular reasoning and discrimination that is difficult to prove. Examples: stock market trading, pricing, job placement, university admissions, lending, insurance premiums, image analysis in medicine, logistics, autonomous driving decisions, predictive policing, weapons technology, privileges for influencers.

Our value system might be slipping away unnoticed.

Ultimately, natural or legal persons must be accountable for automated decisions (e.g., be suable).

Communication moves to private platforms – 2016

•••  ••  •  ••  •••    !  !!

The open e-mail protocol is increasingly being replaced by private chat platforms, SMS by WhatsApp, etc.

Why should that be a problem?

This trend deepens the monopoly position of the providers concerned. In the perception of many users it is simply more practical; they hardly think about the underlying questions.

The consequence must be a demonopolization of the providers by requiring them to open their protocols to third-party providers.

Election manipulations with hacks and propaganda – 2016

•••  ••  •  ••  •••    !  !!

Unfair influencing of elections is becoming increasingly popular with despots. The spectrum ranges from media control to smear campaigns to outright troll factories. Actors who merely chase clicks with emotionalising fakes also play a role.

Even voting machines are proving to be vulnerable to hacks surprisingly often.

• Extreme right-wing: We are against the established media. Putin and other autocrats are being demonised. Everyone should decide for themselves what to believe.

• Conservatives: Fake news should be prohibited. Election machines are safe and very practical.

This endangers our democracies, if only as a question of credibility. Precautions: media literacy, renouncing voting machines. Beware of censorship.

EU General Data Protection Regulation (GDPR) – 2017

•••  ••  •  ••  •••    !  !!

Struggle for the anchoring of extensive rights of self-determination in European legislation:
1. No data processing without consent: data and metadata may only be commercially exploited if the users concerned give their consent
2. Simple protection against online tracking: settings simple and legally binding, tracking walls are prohibited
3. Privacy by default: Do-Not-Track settings as the factory default
4. Limits for offline tracking. Creation of individual motion profiles is prohibited.
5. Right to encryption: Providers are obliged to protect their users’ communication against unauthorised access according to the latest “state of the art” – explicitly also by means of cryptography. Decryption by third parties is prohibited. An obligation to install backdoors is ruled out.
6. More transparency about state access

Ongage:
• What is GDPR and why it’s good for you

ICO:
GDPR guidebook 

Data protection is overrated, bureaucratic and hinders progress. Liberal campaign slogan: “Digital first, concerns second.” Making money is a legitimate reason for data collection.

We can only be happy if strong data protection is enshrined in law. Some aspects may be annoying, but it is an important foundation of a free and pluralistic society.

Trump rules via Twitter – 2017

•••  ••  •  ••  •••    !  !!

Out of distrust of / resentment against the established media, US president Trump seeks an unfiltered way to reach the public.

Trumpists: He is showing the established media who’s boss.

Moderate Conservatives: This is going too far, it’s dangerous.

The erratic behaviour of the US president is irresponsible; he is lying whenever he opens his mouth.

Smart home consoles, VPA – 2015

•••  ••  •  ••  •••    !  !!

Amazon Echo, Google Home

Why not? The devices offer great features, e.g. they can dim the lights and tell you how many days it is until Christmas.

The devices are highly problematic. They leak data and extend a potential surveillance sphere into private space.

Right-wing and conspiracy theorists penetrate social networks – 2014

•••  ••  •  ••  •••    !  !!

Right-wing extremist and conspiracy-theoretical framing becomes socially acceptable.

Targeted disinformation vs. freedom of opinion: Influencing opinions and elections through conspiracy theories, troll factories, Breitbart News, Epoch-Times and many more.

It becomes more difficult for laypersons to judge the credibility of content.

• Fact finding websites help to get more or less unbiased information. E.g. factcheck.org, snopes.com

• Extreme right-wingers see a conspiracy of the liberal press, because they themselves are bombarded with rumours and propaganda in their social media echo chambers.

• The conservative middle class is shocked by the erosion of the rational discourse. Ready to intervene, if necessary, e.g. by deletions. The operators of the social networks shall be held accountable.

• The mechanisms are understood (lack of media literacy, selective perception, filter bubbles, cognitive biases …); only the antidote is missing. Those who immunize themselves against factual correction are difficult to reach.

• Censorship and restriction of freedom of expression are not an option. The operators of social networks must not be punished in such a way that they start to overblock.

• Truth is subjective, facts are not. Beyond the daily news stories, there must be a verifiable and rational yardstick for decisions.

• History shows us: What is considered an error today may be considered true tomorrow. Pluralism must be preserved, but the limits of tolerance must be taken into account with a sense of proportion.

German Network Enforcement Act (Netzwerkdurchsetzungsgesetz) – 2017

•••  ••  •  ••  •••    !  !!

Takes social networks into the obligation to delete “obviously illegal” content.

Minister Maas’ definition of “hate speech” also includes “anyone who publicly or through the dissemination of writings insults the content of the religious or ideological beliefs of others in a way … which is likely to disturb the public peace“.

Wikipedia:
Netzwerkdurchsetzungsgesetz

Netzpolitik.org (in German):
Russland kopiert Netzwerkdurchsetzungsgesetz

heise.de (in German):
Rasterfahndung in Vorratsdaten

Published content must be controlled (hate speech, copyright violations, child pornography …), and the social networks must be held liable.

• Freedom of expression is too precious a good to be put at risk. Private companies must not be turned into judges who pass legal judgements in advance (over-blocking).

• Algorithms get it wrong (irony, satire, humour, art, freedom of expression, provocation), and human moderation is too expensive for the networks. The law is flawed in many respects.

• Financial institutions are obliged to report suspicious money transfers to the criminal investigation offices. Would that be a blueprint?

• Calls for violence and hate speech are not covered by the right to freedom of expression. The responsibility for public statements must be clear.

Rise of paywalls – 2016

•••  ••  •  ••  •••    !  !!

Newspapers and other media install paywalls as an urgently needed funding model.

Quality has its price, even if no one really likes to pay.

Quality has its price – still, we can’t afford to pay everywhere. Paywalls scare off people who should have free access to quality information, leaving them behind in the echo chambers of social media.

Autonomous driving systems – 2010

•••  ••  •  ••  •••    !  !!

Google Car, Robot taxis

Soundcloud:
Algorithmic Decision Making (ADM)

Great new product.

• Very useful as part of an integrated mobility policy.

• There is a lot of ethical work to be done here, because the self-driving systems decide, among other things, about life and death. They could even decide against their owner in order to save others.

• Consider the effects on social systems, the labour market and transport.

• Speed limit 30 or 80 km/h?

Right to informational self-determination versus “super basic right to security” – 2012

•••  ••  •  ••  •••    !  !!

The two terms formulate competing ideologies. The former has been legally documented since 1983, but is often ineffective, while the latter has become a reason of state.

Spiegel.de
The Internet is totalitarian (in German)

Our security is the highest concern. It has absolute priority.

Our freedom is the highest concern. It has absolute priority.

Pokemon Go – 2015

•••  ••  •  ••  •••    !  !!

Augmented Reality becomes mass effective, interweaving of topography and virtuality.

The Intercept:
Privacy scandal haunts Pokemon Go’s CEO

A ridiculous hype – but at least it makes people get some fresh air.

• Interesting: the first broadly adopted application of Augmented Reality; it gives an impression of what is possible.

• Besides, Pokemon Go turned out to be vicious spyware.

Social Credit Scores – 2017

•••  ••  •  ••  •••    !  !!

Wikipedia
Social Credit System

Netflix, Black Mirror
• Nosedive (dystopic fiction)

Wikipedia
Federated Identities

Ultimately nothing other than a credit protection agency – at least you have to know who you are dealing with.
Enables new incentive systems for social control.
A nightmare! The effects on privacy alone are fatal, not to mention political subservience. Here the path to the totalitarian Orwellian state is consistently pursued.

New authoritarianism, in Europe, US and elsewhere – 2011

•••  ••  •  ••  •••    !  !!

Berlusconi, Putin, Kaczynski, UKIP, Front National, AfD, Erdogan, Orban, Trump, Duterte etc.

• Conservatives: We’re worried.

• Authoritarians: Our salvation! Finally someone tells the truth and stands up for our true interests. By the way: Elected is elected.

It turns out that even established democracies are not safe from slipping into authoritarian systems. This is a problem not only with regard to the state’s data-technology capabilities.

The democratic state must uncover and prevent the mechanisms of authoritarian influence on social networks and democratic institutions.

Rising productivity – who benefits? – 2013

•••  ••  •  ••  •••    !  !!

Continuing rampant productivity growth leads to a reassessment of work, education, participation and social redistribution.

Through participation or non-participation (capital) in the gains from rationalisation, the social divide keeps widening – if we let it.

Zeit.de:
• On basic income (ger)
• Why doesn’t Google pay us a user fee? (ger)

Performance is still the yardstick – and, honestly, we do perform great!

• Is the 20-hour week coming? Or the unconditional basic income?

• Growth through new quality requirements (sustainability, social standards, etc.)

• Growth through new business models and needs (virtual goods such as information or entertainment, social needs, …)

• How do we distribute profits as socially as possible?

Face recognition and other biometric procedures – 2011

•••  ••  •  ••  •••    !  !!

Mass recognition, profile building and search of persons on the basis of biometric data (e.g. face file of 64 million Americans by the FBI, Argus Drone, gait recognition, etc.)

Spiegel.de
Testlauf zur Gesichtserkennung wird verlaengert (German)
Software can recognize homosexuals from Photos (Ger)

Youtube:
• How face ID works
• Demonstration video (Ger)

Great opportunities that make life much safer. Finally, we know who does what.

A very invasive technology when it comes to recognising persons. It must not be used for the suspicionless creation of movement profiles without judicial authorisation.

Personalisation, individualisation of products and services – 2010

•••  ••  •  ••  •••    !  !!

Ever better services and new impulses for growth.

All well and good – but what about data protection?

Digital currencies, instant and mobile payment – 2010

•••  ••  •  ••  •••    !  !!

• Conservative: nice.

• Conspiracy theorists: The Rothschilds and Freemasons want to ban us from cash and then take over world domination.

• I don’t want all my purchases and payments to be accessible to others anywhere.

• It is practical and easy to manage.


Cloud and streaming replace personally owned media – 2012

•••  ••  •  ••  •••    !  !!

PCs are delivered without a CD drive, mobile devices are synchronized automatically. Cloud and streaming services are growing.

It’s convenient.

It’s unacceptable that all our data should go overseas and be sent Cc to the NSA. Effectively encrypted solutions and servers of our own are used too little.

Vectoring instead of fibre optics – 2012

•••  ••  •  ••  •••    !  !!

With vectoring we achieve a performance increase using the existing infrastructure.

This is only partly true. It also prevents competition. Instead, the fibre-optic network should be expanded.

Snapchat – 2011

•••  ••  •  ••  •••    !  !!

The idea: short-lived messages that delete themselves

See also “the Internet doesn’t forget”.

Ephemerality and forgetfulness relieve the pressure for self-optimization. In its original conception, Snapchat was an innovative contribution to this.

Snowden revelations, expansion and legalisation of intelligence practices – 2012

•••  ••  •  ••  •••    !  !!

Western secret services also do everything that is technically feasible, without hesitation.

Michael Hayden: “We kill people based on metadata.”

Netzpolitik.org:
• Federal intelligence investigation committee – the conclusion (Ger)

• Secret services must be there, and we must not be worse equipped than our opponents.

• It is treason to make our instruments known to our enemies.

• That’s a smear; we keep to the rules and the oversight works.

• We are the good guys, we are allowed to do that.

• We should have clarified this better beforehand; now we are legalising the existing practice.

• Our worst fears turn out to be true – and worse. We cannot imagine a bigger breach of trust.
• Everything that is technically feasible is being done. Constitutional rights are broken or circumvented.
• Everything we have ever accused the totalitarian systems of (Nazi Germany, GDR, China, Iran, Turkey …), we (Five Eyes, FRG, …) do too – only more perfectly.
• The secret services form a dangerous state within the state. They have blackmail potential for considerable influence on people and decisions.
• The reactions of the state are pathetic and quite telling. There is no hope of clarification and improvement in sight.
• Apart from individual precautions, our options are limited – aren’t they?

Big Data – 2012

•••  ••  •  ••  •••    !  !!

“Data is the new oil.”

Soundcloud:
Frankfurter Zukunftssymposium 2016: Vortrag zum Beispiel Gesundheitswesen/ Krankenversicherung (Ger)

If there’s anything we can do, we do it. People shouldn’t make such a fuss; as individuals they are not as interesting as they think.

Big Data magnifies and exposes all the problems we see with data protection. Information plus computing capacity creates control. Detailed profiling of individuals and groups becomes a decisive instrument of power. There is no reason to entrust it to anyone.

But – what possibilities do we have to prevent this?

Broadband internet – 2010

•••  ••  •  ••  •••    !  !!

( > 2048 kBit/s )

We do need broadband Internet. By 2050.

It should be available to everyone, comprehensively and worldwide. Germany, for example, is lagging behind, while prices are far above average.

Fake news – 2015

•••  ••  •  ••  •••    !  !!

New term for lies and propaganda. Also called “alternative facts” by the originators.

• Definition: “News in the form of deliberately untrue factual claims … where the falsehood is sufficiently obvious.”

• “alternative” media undercut established standards of reporting

• “Deep Fakes” are even smarter: technical progress enables computer-generated “proof” videos, fake voices etc.

Governments develop a power-strategic relationship to facts:
There is a truth (what we say) and lies (what others say and we don’t like).
• The claims made in the age of social media are often unspeakably ridiculous. Yet they are not always easy to disprove (burden of proof).

• What does “obvious” mean and who determines what really is “fake news”? Opinions must be distinguished from facts.

• A governmental definition of “truth” suits dictatorships; that cannot be the answer. The media must also earn their trustworthiness.

Research networks – 2010

•••  ••  •  ••  •••    !  !!

Bellingcat, Vroniplag, Syrian Human Rights Observatory, OpenSchufa …

• Bellingcat.com
• Vroniplag
• Openschufa

Neutral. In response to attacks: this opens the door to lies / denunciation / slander.

An opportunity to create networked knowledge. Be prepared!

Data retention – 2007

•••  ••  •  ••  •••    !  !!

Temporary storage of connection data in telecommunication traffic.

Netzpolitik.org:
• Die Entscheidung zur Vorratsdatenspeicherung und ihre Folgen

• Authorities must have access, if not to prevent crime, then at least to solve it.

• The police must be able to investigate criminal gangs. For this purpose, data from the telephone sector is needed. The police cannot do without technology if the mafia uses it.

• When will you finally understand that data retention is a dead horse? Various high court judgements have been made against it. It massively violates basic rights and the principle of informational self-determination.

• State organs are not per se the good guys. Historically, this is even rather the exception to which we have become too accustomed or we are deluding ourselves.

Bitcoin, blockchains – 2010

•••  ••  •  ••  •••    !  !!

Technology for the anonymous but verifiable processing of transactions (e.g. money payments) and for generating a currency whose supply is technically limited.
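
To make the idea of a verifiable chain of transactions more concrete, here is a minimal, purely illustrative Python sketch. It is a toy model, not the actual Bitcoin protocol; the block fields, the difficulty value and the example transactions are invented for illustration. Each block stores the hash of its predecessor, and a toy proof-of-work makes rewriting history costly, so tampering with an old transaction is immediately detectable:

    # Toy hash chain, for illustration only (not the real Bitcoin protocol).
    # Each block stores the hash of its predecessor; a small proof-of-work
    # (leading zeros in the hash) makes rewriting history expensive.
    import hashlib
    import json
    import time

    def block_hash(block: dict) -> str:
        """Deterministic SHA-256 hash over the block's contents."""
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def new_block(prev_hash: str, transactions: list, difficulty: int = 3) -> dict:
        """Build a block and search for a nonce until the hash starts with
        `difficulty` zeros (a toy stand-in for proof-of-work)."""
        block = {"timestamp": time.time(), "transactions": transactions,
                 "prev_hash": prev_hash, "nonce": 0}
        while not block_hash(block).startswith("0" * difficulty):
            block["nonce"] += 1
        return block

    def chain_is_valid(chain: list) -> bool:
        """Check that every block really points at the hash of its predecessor."""
        return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
                   for i in range(1, len(chain)))

    genesis = new_block("0" * 64, [{"from": "nobody", "to": "alice", "amount": 50}])
    chain = [genesis, new_block(block_hash(genesis),
                                [{"from": "alice", "to": "bob", "amount": 10}])]
    print("chain valid:", chain_is_valid(chain))      # True
    chain[0]["transactions"][0]["amount"] = 5000      # tamper with history
    print("after tampering:", chain_is_valid(chain))  # False

Running the sketch prints that the chain is valid at first and becomes invalid as soon as the earliest transaction is altered.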

CryptoKitties: a game with “genetically” unique cat pictures derived from one another

Explanatory video
CryptoKitties

Skepticism as to whether crypto currencies are stable in value.

Loss of control: Bitcoin is the currency of blackmailers and money launderers.

Crypto currencies are an opportunity to establish a completely new concept of money that works decentrally and avoids many disadvantages of controlled currencies.

Open AI – 2015

•••  ••  •  ••  •••    !  !!

An AI-critical project for the development of safe and generally accessible AI, initiated by Elon Musk, who warns, among other things, of the dangers of strong AI.

• https://openai.com

• The development of AI must be committed to the common good.

• The actual danger posed by strong AI is assessed differently, with a tendency towards de-dramatisation.

• Weak AI should not be underestimated as a market and power factor. A corrective is welcome here.

Mobile robots – 2008

•••  ••  •  ••  •••    !  !!

Various industries and all of our private lives will change in the long run. Acceptance in the private sphere will increase with tempting offers.

This will be the next major growth spurt. And it saves us personnel on a grand scale.

Are we all becoming superfluous now? The machines will score with AI – with all its dubious blessings and consequences. How do we establish standards so that this does not turn against us?

Drones (use on a large scale) – 2010

•••  ••  •  ••  •••    !  !!

Unmanned flying vehicles challenge our environments and ethical standards.

War drones: Why should we put our boys’ lives at risk?

Other applications: The future will have drone highways.

The drone war is a crime.

Video and delivery drones are interesting.

Spy drones the size of insects are creepy (see autonomous weapons).

AI for rent – 2006

•••  ••  •  ••  •••    !  !!

Amazon Web Services, Einstein, Intel Go, IBM Watson and others

Manager-Magazin
• KI mieten, kaufen, entwickeln?

How can we manage not to miss the train?

See “Open AI”, 2015.

Navigation systems – 2005

•••  ••  •  ••  •••    !  !!

Street View, Google Earth, NASA Mars

Military infrastructure.

Huge progress for individual mobility.

Internet of things – 2006

•••  ••  •  ••  •••    !  !!

Smart Home, Industry 4.0

• integrata-stiftung.de/projekte/q-siegel (in German)

Great growth impulses, rationalization, great new products, monitoring and control.

For many supposedly convenient functions it is not worth giving up one’s own autonomy. Where we use them anyway, we must ensure that high security standards are established. Otherwise there is a risk of abuse (by criminals or states) or of an independent AI (see singularity).

The internet never forgets – 2010

•••  ••  •  ••  •••    !  !!

Wikipedia:
Streisand Effect
Be careful about what you are posting.

It’s not just about a few posts and duckface selfies. Whatever is released into the world can never be taken back. This can have negative effects on anyone (e.g. in combination with Big Data).

Internet blocking and disconnection – 2009

•••  ••  •  ••  •••    !  !!

Blocking particular websites (which are considered harmful) or the entire Internet (e.g. as a means of counterinsurgency)

The Guardian:
Internet shutdowns in Syria

It must be ensured that no unwanted content is delivered. And if things get dicey, you have to switch off completely.

Thoughts are free and net blocks are censorship. Therefore they are – even with well-meant intentions – all too easily abused for the purpose of maintaining power.

Wikileaks – 2006

•••  ••  •  ••  •••    !  !!

Whistleblower platform with a unique infrastructure

• wikileaks.org

Betrayal of secrets is a damnable crime committed by attention-seeking freaks. “They put lives at risk.”

Responsible whistleblowing has our sympathy, and Wikileaks is the leading technical platform here. Meanwhile, Julian Assange seems to have an ambivalent agenda, which is why many people are turning away from him.

The question of the “cui bono” (for whose benefit?) of revelations, and of their veracity, must always be asked.

Net neutrality – 2006

•••  ••  •  ••  •••    !  !!

Companies want to see their own online services in the fast lane, while others will be delivered more slowly.

• See also “Cultural struggle for an open web”

It is not clear why certain services should not be allowed to extend their offering accordingly – the customer benefits from it!

Net neutrality is a very important concept for maintaining the openness of the Internet. Once the principle is overturned, the paying services have the upper hand and the transition to censorship becomes seamless.

E-Learning – 2002

•••  ••  •  ••  •••    !  !!

The way we learn is constantly changing. Knowledge becomes less exclusive.
New forms of learning are emerging (microlearning, etc.).
Old elites are developing new mechanisms to differentiate themselves. “It’s not what you know, it’s whom you know.”

E-Learning is a prime example of how the Internet contributes to democratisation and emancipation. Although many global problems are accelerating, there is hope that solutions will also be accelerated through the exchange of ideas.

Online search / governmental malware – 2006

•••  ••  •  ••  •••    !  !!

(For arguments, see also data retention – 2007)

The state must have access – within a given legal framework, of course.

If the state profits from weakening the digital infrastructure instead of fixing it, it is not acting for the common good. Even though the need is clear and judicial review is possible, the process is very problematic and its proportionality is in question.

Liability for open Wi-Fi / cease-and-desist industry – 2004

•••  ••  •  ••  •••    !  !!

Especially in Germany, running an open hotspot was for a long time an unbearable risk; only in 2018 was the liability for possible violations abolished.

Anyone using the internet should have an ID. Only people with dark interests can dislike this idea.

Free and unsurveilled access to the internet should be recognised as a basic right, always and everywhere. The few criminals will surely always find a way. (“When privacy is a crime, only criminals have privacy.”)

Free-cost mentality II: Google – 1997

•••  ••  •  ••  •••    !  !!

At first “only” a search engine, Google will soon show how to profit from the information it collects.

• At first: Great thing, but how are they ever gonna make any money?

• Later: How much does it cost to come first in the search results?

• Great thing … and how do I get to the first places in the search?

• Later: Very practical …, but the company has too much power. (Break up Monopolies!)

• People develop too little awareness of what they are revealing with their data.

Augmented Reality – 2005

•••  ••  •  ••  •••    !  !!

Enriched geodata, museums, print-to-3D, Pokemon Go …

Crazy … let’s see what the future brings us.

Great opportunities, from virtual learning to online gaming.

Terror and the reactions – 2001

•••  ••  •  ••  •••    !  !!

… here regarded especially as a media phenomenon and as a justification for indiscriminate mass surveillance

welt.de:

• Interior minister Friedrich declares homeland security to be a “super fundamental right” (German)

• We must not allow ourselves to be unsettled by terrorists and must counter them with the most powerful instruments possible. Private spaces (encryption) must be made comprehensively accessible to the authorities.

• Investigation successes prove us right.

• Attacks are being misused/used as an argument for abolishing the free Internet.

• Suspicionless mass surveillance is a totalitarian instrument. The benefits for the general public are disproportionate to the damage caused by the state’s interference with individual fundamental rights.

• Internal security is promised as a “super fundamental right” – an impossible promise.

• Mass surveillance has proven to be ridiculously ineffective, a finding that interested parties dismiss.

• Everyone is under surveillance except the real criminals, who know how to evade it.

Social networks and shops: giving away data – at what cost? – 2004

•••  ••  •  ••  •••    !  !!

Post-privacy = Data exhibitionism, leads to an Internet with “total freedom of information”
(Merkel: data wealth instead of data minimisation)
Just let people do it and follow their interests.
Though personally, this is not for me.
• There is a rising inner need for self-promotion and -optimization.

• With information getting out of hand, profiling becomes easy. In social media you get more information about job candidates than in any expensive assessment. At some countries’ borders it is already a common request to show your contacts and activities (e.g. books you ordered at Amazon).

Autonomous weapons – 2005

•••  ••  •  ••  •••    !  !!

Deutschlandfunk Kultur:
Deadly autonomous weapons (in German)
• Weapons are neither good nor bad – it depends on who uses them. (… and btw.: We are the good guys!)

• Weapon systems can be equipped with an “ethics module”.

• With (partly) automated decisions about life and death we see a new quality in warfare. A ban, as with biological and chemical weapons, would be appropriate.

• A possible convention: Autonomous weapon systems may only be directed against things, not against people.

Media literacy, e-skills – 2003

•••  ••  •  ••  •••    !  !!

e.g.:
Deutsche Bahn vs. Markus Beckedahl
People need to be taught what they encounter on the Internet and how they can for example protect their children from inappropriate content.

Some personal things just don’t belong in public.

Media literacy belongs to the responsible citizen and is the most effective weapon against fake news.

Data protection concerns everyone.

Programming is a cultural technique like reading and writing.

Government transparency / Freedom of Information Act – 2006

•••  ••  •  ••  •••    !  !!

USA: Freedom of Information Act, 1966

• fragdenstaat.de

The subject is annoying and is gladly avoided. If addressed: freedom of information creates bureaucracy and hinders the exercise of power.

Protect private data, use public data! It goes without saying that representatives of the people are accountable and that data created with tax money must be available – as is standard practice in the USA, for example.

Web accessibility for all – 2000

•••  ••  •  ••  •••    !  !!

With the increasing importance of the Internet in everyday life, the interest in letting everyone participate increases.

• Accessible Internet
• Easy language
• Make the Internet available everywhere
• interference liability

Accessibility for people with disabilities is a good goal, but it is not always taken seriously in practice.

• Accessibility for people with disabilities is a good goal, but it is not always taken seriously in practice.

• Remember to involve countries that are less industrialised and, for example, to take their transmission standards into account.

! Security breaches in hardware and software – 2000

•••  ••  •  ••  •••    !  !!

Zero-day exploits, “Heartbleed” (a long-standing vulnerability in SSL, open-source software), “Spectre” and “Meltdown” (in Intel chips), “WannaCry” (US$ 4.5 billion in damage) and many more.

Wikipedia:
• NSA-Hintertüren in IT-Hardware
• Stuxnet

We are doing everything we can to provide the state with the most powerful instruments possible. Moreover, everyone should take care of their own security.

States and secret services must not exploit security holes for themselves or weaken hardware and software with backdoors. They have to be committed to the integrity of IT systems.

Open source code – 1987 / 2001

•••  ••  •  ••  •••    !  !!

The idea: Software code is disclosed and can therefore be verified. Volunteers can participate worldwide in optimizing solutions and closing security gaps.

Projects like
• Wikipedia
• opensource.org
• mozilla.org
• creativecommons.org

• For various reasons the big players still get the contracts (administrations using Microsoft etc.) – what’s the problem?

• We have no desire to relearn how to operate the software.

Open source is safer, non-commercial, cuts costs and can be adapted to the requirements.

E-books, new music and movie licensing models – 2000

•••  ••  •  ••  •••    !  !!

Performance must be rewarded.

Fine by me, as long as the prices are fair and the authors benefit. (The times of Napster were cool, though …)

! Copyright: file-sharing, piracy, re-posting, snippets, plagiarism and memes – 1999

•••  ••  •  ••  •••    !  !!

Filesharing networks (Napster), YouTube, illegal streaming services (Kinox.to), software, unlicensed images and fonts, trademark rights, patents, Google News and much more.

• Piratenpartei: Urheberrechts-Dialog

Rights and patents must be respected without ifs and buts; otherwise cease-and-desist letters will follow. Internet companies must monitor content uploads accordingly.

Only copyright law creates incentives to create intellectual values at all. At some point the Chinese will understand that.

• Ideas and content shall flow freely, that was once the purpose of the Internet. And by the way – the unconditional basic income is coming soon.

• Many creative people refuse to share their laboriously developed content with everyone free of charge.

• Academic cheaters who plagiarized their dissertations are exposed with enthusiasm.

Freedom of press for journalists and bloggers, protection of sources – 2003

•••  ••  •  ••  •••    !  !!

The press is commonly acknowledged as the “fourth estate”. Yet some despots try hard to shape public opinion. Bloggers are becoming a tribe of their own, with some considered almost-journalists, while others are treated as badly as any critical opposition.

• The press must adhere to its standards (e.g. secrecy) and should not be too much of a nuisance to the rulers.

• Sources that betray secrets should be disclosed.

• Bloggers are private individuals and are not subject to special protection.

Freedom of the press and source protection are key to a functioning, pluralistic civil society. They must also apply to bloggers who write and research on their own behalf. Even error is to be tolerated. Fake news must be refuted with objective arguments and the interests behind it uncovered.

Hackers – 1980

•••  ••  •  ••  •••    !  !!

Hackers make the Internet insecure and have to be hunted down. The Internet should become a clean place.

• Heroes: Without hackers we would be at the mercy of corporations.

• Criminals: an annoying evil, but that’s the way the world is.

! Ban on encryption or encryption as default? – 1998

•••  ••  •  ••  •••    !  !!

“Haussmannisation”: the attempt to make the Internet controllable (after Haussmann, the creator of the Paris street network, which among other things offered the police more control)

How much surveillance do we concede to a state?

• The Guardian:
Cameron wants to ban encryption
• EFF:
There is no middle ground on encryption
• Android Authority:
What to do when US law enforcement asks for your password

The state must have comprehensive access. We must not allow terrorists, criminals and other enemies to communicate undisturbed. Providers must be held accountable and must cooperate.

• Encryption is the technical equivalent of the secrecy of correspondence and the fundamental right to freedom of expression. Any uncertainty immediately triggers self-censorship.

• Phil Zimmermann: “When encryption is outlawed, only outlaws will have encryption”

• Law enforcement authorities are not automatically the good guys (Inquisition, Gestapo, Stasi, Sharia police); in Germany and globally this is historically even the exception.

• Our standards must therefore be applicable worldwide in order, for example, to protect and support NGOs, dissidents, humanists, dissenters …

! Redefining customer relationships – 1999

•••  ••  •  ••  •••    !  !!

The Cluetrain Manifesto

Insist on linear and hierarchical communication channels. (Sender > receiver, “What’s the point? Print this out!”)

Customer communication will never again be a one-way street; consumers will gain power.

Safe Harbor / Privacy Shield – 2000

•••  ••  •  ••  •••    !  !!

Agreements with the USA on data protection standards

There you go – everything’s fine.

A sham; our rights are being widely disregarded. Data should not go to the USA at all.

Free-cost mentality III: Creative commons – 1998

•••  ••   ••  •••    !  !!

Photographers, musicians, 3D designers and many others make their works available under CC licences – depending on preference, for non-commercial projects only, in return for attribution, etc.

• Public domain, royalty-free content
• Unsplash
• Wikimedia.org

Always welcome in the sense of cost optimization – but we ourselves insist on patents, rights etc.

• Against increasing self-exploitation and over-exploitation in creative professions

• Everything should belong to everyone – and this is a great step towards this aim.

Tor network, darknet – 2002

•••  ••  •  ••  •••    !  !!

Allow anonymous use of the Internet or the bypassing of public servers.

A den of evil – fight it!

For dissidents in authoritarian states, but also here, such free spaces can be vital (protection from harassment and violence), even if they can also be used by criminals. Not everyone can afford freedom of expression under their real name!

! Power and (im-)morality of the internet corporations – 1998

•••  ••   ••  •••    !  !!

New mentalities and business models in the New Economy, “Don’t be evil” (Google), “Think different” (Apple)

Skepticism (dotcom bubble) and amazement, but also enthusiasm. Hopes for growth impulses, rationalisation effects and Industry 4.0.

“The Internet is still uncharted territory (‘Neuland’) for all of us.” (Angela Merkel, 2013)

Openness to the new possibilities, mentalities and concepts, but also scepticism towards the breathtaking concentration of power of the market leaders.

! Cultural struggle for an open web – 1992

•••  ••  •  ••  •••    !  !!

Restrictions on freedom of expression, the confidentiality of communications and net neutrality contradict the original credo of the early years of the Internet.

• Accessnow.org: Eight steps towards an open Internet (PDF)

The Internet has potential as a growth engine. However, uncontrolled growth must be prevented. Priority must be given to economic interests and the security responsibility of the state.

The Internet can make a decisive contribution to the dissemination of ideas, the democratisation of societies and the efficient management of resources. It must be open, uncensored and neutral as a medium.

The Internet goes mainstream – 1991

•••  ••   ••  •••    !  !!

The new concepts (interactivity, global availability, hyperlinks instead of linear thinking, e-mail, newsgroups, …) are only slowly understood by many. But the effects are lasting and extend into the most private areas (partner search, games, friendships, finances, health …).

• The Guardian: How the internet was invented

This so-called Internet is highly overrated.

Many utopias are forged (global village, education for all, transparent democracy, effective administration) and partly implemented.

Free-cost mentality I: Microsoft – 1985

•••  ••   ••  •••    !  !!

Microsoft aggressively conquers the market with crappy free applications.

What pleases, prevails.

The most common solutions are far from always the best. Alternative: open source or, at least, quality software.

Industries in transformation, with unpredictable consequences to this day – 1995

•••  ••  •  ••  •••    !  !!

Concerns the music and film industry, book trade, the press, job search, retail, banks, travel and tourism, the taxi industry, restaurants, education, public authorities and many more …

Interest-driven positions from skepticism to euphoria. Often little imagination as to how the change can be tackled.

Openness to new ideas on the part of some, scepticism and resistance on the part of the “trade union wing”. Lifelong learning and flexibility become new virtues, sometimes even self-exploitation. New habits (killer apps) sometimes take hold very quickly.

“Volkszählung”, national census in Germany – 1987

•••  ••  •  ••  •••    !  !!

Privacy rights vs. legally enforced delivery of private information

The benefits outweigh the concerns. The die-hard traditionalists are always against everything.

• Fight the beginnings! The state is not to be trusted; among other things, corporate and power interests are at stake here.

• Sociological data are needed for development goals.

Dragnet investigation, targeted profiling, microtargeting – 1980 / 2012

•••  ••  •  ••  •••    !  !!

Targeted addressing of certain groups of people (e.g. in a police search or in election campaigns) according to characteristics. Enhanced by large data collections such as social networks or Google profiles.

Whatever is useful is permitted.

Problematic, especially if data is evaluated without consent.

Whistleblowing, purchase of classified data – 1960

•••  ••   ••  •••    !  !!

Spiegel affair, Pentagon Papers, Watergate, Snowden, tax CDs, doping, and much more.

The data is stolen, and we have nothing to do with this.

Whistleblowers deserve protection, because they provide a corrective against fatal wrong turns. The illegal aspects are usually negligible compared to the benefits.

Enforced globalisation, trade agreements – 1986

•••  ••  •  ••  •••    !  !!

WTO, TTIP, CETA, common markets etc.

New opportunities are opening up for the economy. This serves the common good.

Could be good for emerging markets, but only if trade is fair. Unilateral overreach and special courts threaten the common good.

Cyberpunk – 1980

•••  ••  •  ••  •••    !  !!

Dystopian literature genre

Weird stuff.

Interesting …

! Transhumanism: Cyborgs, neuro-implants and enhanced intelligence – 1960

•••  ••  •  ••  •••    !  !!

Augmented intelligence, implanted tools, brain scan & Co.

Soundcloud: Lecture by Stefan Lorenz, Zukunftssymposion Stuttgart 2016 (in German)

Fears of a dissolution of the traditional idea of the person or soul are awakened. Nevertheless, experiments are being made, driven by economic interests.

Rarely undivided euphoria; some fear optimisation pressure or the superiority of a new caste. Nevertheless, the boundary between human and technology has long been blurry. What criteria do we have for evaluation? And who is even asking?

Digital avantgarde, data dandies, bloggers, start-ups – 1990

•••  ••  •  ••  •••    !  !!

Suspicious persons

• A new lifestyle, new opportunities, freedom

• Distrust from the traditional left wing and the unions

! Artificial Intelligence (AI) – 1960

•••  ••  •  ••  •••    !  !!

Deep learning: self-learning systems write code that people can soon no longer understand.

Learning systems have a big future (1997: Deep Blue, 2016: AlphaGo, 2022: ChatGPT, Stable Diffusion). The increase in computing capacity and information density contributes to the exponential development of AI over time. Whether and how a final “singularity” would threaten us is still the subject of debate.

Problem: There is no economic incentive to maintain a certain level of quality criteria. Fierce competition punishes safety thinking.

• Ray Kurzweil

• According to the conservative Christian view, which presupposes a soul, it is difficult to imagine artificial intelligence, personality or even consciousness. The spark of human intelligence is missing.

• An increase in productivity is gladly accepted, otherwise the world may remain as it is.

• Fascination and horror face each other: If consciousness is a question of computing power and learning, then it could also arise artificially. Which properties and laws should it be subject to? Is our fixation on the human a chauvinism or is it legitimate? Who controls the exponential increase in power through AI?

• Warners: Precautions must be taken to prevent a hostile takeover by machines that is harmful to humans (singularity).

• Defeatists: The takeover of world domination (possibly on a cosmic scale) by machines is only a question of time. Mankind is just an intermediate step in evolution, which will continue on the basis of silicon.

• Sceptics: AI is limited; it is not capable of truly creative achievements or feelings. In contrast to us, things and even its own existence mean nothing to an AI unless this has been explicitly programmed into it.

Future studies: Which technology will prevail? – 1950

•••  ••  •  ••  •••    !  !!

Futuristic movie from 1967, with some interesting predictions:
The Year 1999 A.D.
We need standards and planning certainty.

Controversies in industry are often carried out robustly.

On the one hand, consumers are at the mercy of the developing companies, but on the other hand they also decide on success.

Cartels are undesirable, but functioning standards are very welcome.

Personal Computers, PC – 1976

•••  ••  •  ••  •••    !  !!

Megatrends such as miniaturization, increasing performance and easier use of computers are leading to the spread of PCs. Graphical user interfaces and increasingly mobile devices continue this trend.

Increase in productivity

Tools for creativity

CAD, databases, desktop publishing etc. – 1978

•••  ••  •  ••  •••    !  !!

Computer-based systems found their way into ordinary occupations and turned the working world of the people concerned upside down. Older employees often felt overwhelmed by the new methods.

Everything that brought efficiency was welcomed; possible negative effects were denied.

Initially, technology-sceptical and euphoric positions were opposed to each other. Many regarded computers as inhumane and a threat to employment. This phobia offered Apple the starting point for personal computers that could be operated more intuitively and ergonomically.

Laws of robotics – 1942

•••  ••  •  ••  •••    !  !!

Isaac Asimov et al.

Let’s simply do things and see where it leads us!

The laws have the right intention, but they are being disregarded. See weapon systems, driving assistance, AI etc.

Dystopias II: Incapacitation through dependency on technology – 1968

•••  ••  •  ••  •••    !  !!

2001: A Space Odyssey, Blackout (Elsberg), Black Mirror and many others

• No worries, we can handle that.

• Everything is getting worse!

• Our dependence is already immense and confronts us with very practical questions. We must achieve higher security standards in order not to risk a complete breakdown of civilisation (e.g. concerning archives) in the worst case.

• It is possible that the transition to “machine domination” will be creeping and unnoticed by us.

RFID – 1948

•••  ••  •  ••  •••    !  !!

Radio-Frequency Identification, a technology for transmitter-receiver systems for the automatic and contactless identification and localisation of objects (badges …) or living beings (implanted chips).

The enormous rationalisation effect is welcomed.

There were protests against the use of this technology, because it can be used to spy on people (e.g. as consumers).

Artificial neural networks – 1943

•••  ••  •  ••  •••    !  !!

Warren McCulloch and Walter Pitts describe networks of elementary threshold units with which practically any logical or arithmetic function can be calculated. In 1947 they pointed out that such a network could be used, for example, for spatial pattern recognition.

Graphic pattern recognition by AI offers many opportunities today (a famous example: analysing X-ray images more reliably than doctors). This also results in various problems. See also Face recognition, 2011.
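
As a small illustration of such elementary units, here is a hedged Python sketch: a simplified threshold model in the spirit of McCulloch and Pitts, not their original 1943 formalism (inhibition is approximated here with a negative weight). A unit outputs 1 when the weighted sum of its binary inputs reaches a threshold, which is enough to realise logical functions such as AND, OR and NOT:

    # Toy McCulloch-Pitts threshold unit, for illustration only.
    # It fires (returns 1) when the weighted sum of its binary inputs
    # reaches the threshold; weights and thresholds are chosen by hand.
    def mp_neuron(inputs, weights, threshold):
        return 1 if sum(x * w for x, w in zip(inputs, weights)) >= threshold else 0

    def AND(a, b):
        return mp_neuron([a, b], [1, 1], threshold=2)

    def OR(a, b):
        return mp_neuron([a, b], [1, 1], threshold=1)

    def NOT(a):
        # inhibition approximated here with a negative weight
        return mp_neuron([a], [-1], threshold=0)

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "AND:", AND(a, b), "OR:", OR(a, b))
    print("NOT:", NOT(0), NOT(1))  # prints: NOT: 1 0

Chaining units like these together allows any Boolean function to be composed in principle, which is the point of the original result.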

Virtual reality, virtual embodiment – 1956

•••  ••   ••  •••    !  !!

Computers are increasingly simulating real-world environments. Everything is possible, from walk-in architectural models to telemedicine and role-playing games. In extreme cases it is possible to go beyond simulated presence to full identification with an avatar (embodiment, immersion).

• Techradar:
Beyond haptics – blurring the line

The creation of completely virtual environments creates distrust; the traditional “ordered” world view feels challenged. Computer games are met with scepticism – Europe reacts phobically to depictions of violence, the USA to everything sexual.

A broad spectrum from harsh rejection to enthusiastic approval. Where one person fears a loss of reality, another simply sees fun, pop culture or fantastic possibilities.

Dystopias I: Incapacitation through totalitarian control – 1932

•••  ••  •  ••  •••    !  !!

Brave New World, 1984 (Orwell), Minority Report, Black Mirror and many others

It won’t turn out that bad.

Just imagine it. Experience proves that man is capable of the worst – and so, one day, will be technology.

! Technology inspires ... but shall not be used against us – ca. 1700

•••  ••  •  ••  •••    !  !!

See also: Swing Riots

After initial scepticism, interesting products, markets and convenient functions are gladly taken up. As long as the traditional economy and the state are not endangered, everything will be fine.

A “life like 1980, only with iPhone” is a deceptive longing. Some interests of the economy and the state stand in the way of our fundamental right to privacy. The liberal society must be defended by all means – but unfortunately these means are very limited.

Since experience has shown that everything that arises from technological development is used, it is important to find ways of stopping or restricting development worldwide (e.g. in the case of nuclear weapons).

We have no effective handle on the course of technological progress. The focus is on the analysis of benefits and risks.

Legend

Problems <> Opportunities

•••  ••   ••  •••

Ethical conflict potential

!  !!