Is technology hurting our intelligence?

Technology

We’ve all heard the stories of cars being driven into bodies of water because the driver trusted the navigation system. Could technology be making us less intelligent? Trust in tech is the topic of discussion for Arizona State University professor Katina Michael from the School for the Future of Innovation in Society and Alex Halavais, an associate professor in the School of Social and Behavioral Sciences at ASU.

In this segment:

Katina Michael, Arizona State University professor from the School for the Future of Innovation in Society; Alex Halavais, an associate professor in the School of Social and Behavioral Sciences at ASU

Source: Tech episode: https://azpbs.org/horizon/2019/08/is-technology-hurting-our-intelligence/

Source: Whole episode: https://www.pbs.org/video/8-14-19-stock-market-technology-slavery-ze40pa/

Citation: Katina Michael and Alex Halavais with Ted Simons, August 14, 2019, “Is technology hurting our intelligence?”, Arizona Horizon, PBS: Channel 8, https://azpbs.org/horizon/2019/08/is-technology-hurting-our-intelligence/

Data Expert Warns Encryption Laws could have Catastrophic Outcomes


A University of Wollongong data expert has labelled the government's proposed encryption laws "delusional" and warns they could have catastrophic consequences.

The changes would force technology companies to help police access encrypted messages.

Professor Katina Michael, from the School of Computing and Information Technology, says the powers are unprecedented and have no oversight.

She is speaking to ABC reporter Kelly Fuller.

Citation: Katina Michael with Kelly Fuller, “Rushed Encryption Laws Herald a Watering Down in National Security”, ABC Illawarra: Radio, 6 December 2018, https://soundcloud.com/kelfuller/data-expert-warns-encryption-laws-could-have-catastrophic-outcomes

Microchipping Employees and Potential Workplace Surveillance


British companies are planning to implant staff with microchips to improve security. Sputnik spoke about it to Katina Michael, a professor in the Faculty of Engineering and Information Sciences at the University of Wollongong.

Sputnik: Could companies sell employees' personal data to third parties?

Katina Michael: The first thing to know is that before an employer considers selling discrete implant data to a third party, they would likely use it to monitor their own staff: for example, for physical access control, to see how staff congregate to exchange ideas, how often they use the restroom, or how quickly they complete certain tasks. That is not to say this would occur, but quite possibly the implant would be used as a timestamping device. By comparison, today we commonly find facial or fingerprint recognition being used to let employees log their time at work.

But a company can now use this technology to look introspectively at what employees are doing. Consider how employers already gather data on their employees through smartphones: I know a lot of companies have employees sign an agreement when they offer them a company-sponsored smartphone, stating that the company may log their location and time via that device. Beyond that, I don't believe a corporation would sell the information.

Sputnik: But if companies were to sell personal data to third parties, what could employees do to prevent that from happening?

Katina Michael: Employees would not be able to block the distribution of data gathered from their implantable devices unless they have signed a legal agreement withholding consent, or are protected by local workplace surveillance laws. Only then could they block the corporation from sharing that information with other companies, such as health insurance providers.

Sputnik: Could employers know if staff contacted a competitor about a job?

Katina Michael: You have to consider that the diffusion of these implants is only a couple of hundred people in the UK, for example, and many of them are not in an employment context. In one case an implant was provided to someone with a specific assistive technology need, an amputee. Looking more widely around the world, we could say that probably a few thousand people at most have an implant, mostly hobbyists who are enthused by technology and progress and by being able to automate certain aspects of their lives.

I don't believe that, for the time being, information would be exchanged when one implantee meets another, because of the limitations of the radio-frequency identification technology being used. These devices don't act like smartphones; for now they are proximity devices that require you to be no more than ten centimetres away from a reader.

Citation: Katina Michael and Laurie Timmers, 2018, “Businesses to Microchip Employees 'to Monitor' Staff”, Sputnik International News, https://sputniknews.com/analysis/201811121069747561-business-microchip--monitor-staff/

Tech for Good: The Role of ICT in Achieving the SDGs

What opportunities and challenges do digital technologies present for the development of our society?

https://vimeo.com/288621991


I truly believe that we can harness technology for good, and that information and communication technology is key to achieving the Sustainable Development Goals. But more than this, we need to be human. Being human means that we can achieve anything together through compassion, care, foresight, and long-term sustainability. Right now we use technology in ways that help us gain access to critical information, but also as a means to become more engrossed in ourselves and our personal interests alone. What about the public interest? What about public interest technologies like those being suggested by the SDG Academy and all of its speakers? Think about doing this rewarding course. It takes a mission-critical view of how technology can be used (or abused) as a tool for (dis)empowerment. We have a choice, and from our perspective the choice is easy: we MUST use technology for good.

The trailer for the magnificent SDG Academy. Here are courses delivered by the SDG Academy. More about the free online courses here.

My involvement was in three MOOCs related to privacy, data rights, security, and ethics, with a heavy emphasis on human rights throughout. Stay tuned for more.

About this course

Tech for Good was developed by UNESCO and Cetic.br/NIC.br, the Brazilian Network Information Center’s Regional Center for Studies on the Development of the Information Society. It brings together thought leaders and changemakers in the fields of information and communication technologies (ICT) and sustainable development to show how digital technologies are empowering billions of people around the world by providing access to education, healthcare, banking, and government services; and how “big data” is being used to inform smarter, evidence-based policies to improve people’s lives in fundamental ways.

It also addresses the new challenges that technology can introduce, such as privacy, data management, risks to cybersecurity, e-waste, and the widening of social divides. Ultimately, Tech for Good looks at the ways in which stakeholders are coming together to answer big questions about what our future will look like in a hyper-digitized world.

This course is for:

Technology specialists who want to understand more about how ICT is being used to improve people’s lives around the world.
Sustainable development practitioners who need to understand the opportunities and limitations of technology in a development context.
Advanced undergraduates and graduate students interested in the key concepts and practices of this exciting and ever-changing field.

What you'll learn

  • ICT can improve access to knowledge and services, promote transparency, and encourage collaboration

  • Responsible collection and use of data requires governance, security, and trust

  • ICT projects should be contextualized and inclusive

  • Technology is not neutral! Be aware of bias in design and implementation


Course Syllabus

Module 1: Welcome to the Digital Age

  • Introduction to the Course

  • Bridging the Digital Divide

  • Three Approaches to ICT for the SDGs

Module 2: Technology for Governments and Citizens

  • Equity and Access to Services

  • User-Driven Public Administration

  • It's All About the Data

  • The Open Government Approach

  • Case Study: Aadhaar in India

  • The Challenges of Digital Government

Module 3: ICT Infrastructure

  • Enabling ICT: The Role of Infrastructure

  • Promoting Digital Inclusivity

  • Innovations in Infrastructure

  • Building Smart Sustainable Cities

  • ICT as Infrastructure: A Look at Societal Platforms

Module 4: ICT Innovations in Health

  • Achieving Universal Health Coverage

  • Improving Healthcare Delivery

  • Involving the Community

  • Evidence in Action: Success Stories of ICT and Health

  • Emerging Challenges and Opportunities

Module 5: Learning in Knowledge Societies

  • The Ecosystem of ICT for Education

  • Education for a Connected World

  • Sharing Knowledge: ICT, Openness, and Inclusion

  • Measuring ICT and Education: Frameworks

  • Measuring ICT and Education: Data and Indicators

  • Rethinking ICT for Education Policies

Module 6: Promoting Financial Inclusion

  • An Introduction to Financial Services

  • The Potential of Digital Platforms

  • Mobile Payments for Marginalized Communities

  • ICT for Enabling Access to Credit

  • Replacing the Cash Economy

  • The Challenges of ICT-enabled Financial Inclusion

Module 7: Measurement and Metrics

  • Managing Data for the SDGs

  • ICT Innovation for Statistical Development

  • Engaging with Data: Communications and Citizen Empowerment

  • Case Study: Brazil’s Cetic.br

  • Measuring ICT

  • ICT for Monitoring the SDGs

  • Limitations of ICT for Monitoring the SDGs

Module 8: Artificial Intelligence

  • An Introduction to Artificial Intelligence

  • Who Drives the Agenda on “AI for Good”?

  • Implications for Discrimination and Exclusion

  • The Human Side of AI: Risks and Ethics

Module 9: Concerns for our Digital Future

  • Privacy and the Importance of Trust

  • Knowing your Data Rights

  • Cybersecurity

  • The Downsides of Digital

Module 10: The Way Forward

  • The New Workforce: Six Points about the Future of Work

  • The Meaning of Work in the Digital Era

  • The Open Movement

  • Closing Thoughts on ICT for the SDGs

Original link here: https://www.edx.org/course/tech-for-good-the-role-of-ict-in-achieving-the-sdgs

Consumer Digital Touchpoints Online: It's messy

I asked everyone from Facebook to data brokers to Stan for my information. It got messy

It is almost impossible to understand your full Facebook data footprint. (Credit: ABC) 


28 April 2018

By technology reporter Ariel Bogle

Brands I've never heard of have my details.

Deciphering your Facebook data can be like leafing through a corporate-owned teen diary.

In 2007, one of my first comments was telling a friend she had a "fashionable mullet", but my online data footprint has exploded since then.

I downloaded my data from Facebook in an effort to understand how brands target me with personalised advertising — an activity that accounted for 98 per cent of the social giant's 2017 revenue.

Your name, age and location are the least of it. Every like, link and interaction can add to your profile, whether it's an inferred political preference — are you liberal or conservative? — or an interest in board games.

But as Wired has detailed, Facebook's data download provides an incomplete picture.

To fix that, I asked for my personal data (you can too, thanks to the Privacy Act) from everyone from data brokers to advertisers.

What did I find? That understanding who knows what about you online is a Sisyphean undertaking, one that takes dozens of emails and almost a month.

What do data brokers know?

Ever heard of a data broker? If you haven't, that's no mistake.

"They rarely have a public presence," said Sacha Molitorisz, a digital privacy researcher at the University of Technology Sydney.

"My guess is there is an intuition somewhere there, that what they're doing might not be palatable to customers."

Data brokers are companies that may gather online and offline information — census data, surveys and purchase histories, for example — to create consumer profiles that they serve to advertisers.

In the market for a new car? An expectant mother? These are the types of insights they look for.

If advertisers want to reach these people, they can source special audience information from data brokers and target ads to them on Facebook.

This is allowed under Facebook's Partner Categories program, but after the Cambridge Analytica scandal, the company said it would be winding the option down.

A Facebook spokesperson said ad campaigns run this way would end by October 1, 2018.

For now, though, Facebook works with three providers in Australia: Quantium, Acxiom and Experian.

I contacted all three and asked for my personal data. All three said they had nothing — but that's not the whole story.

How am I targeted?

Earlier this year I was served a Facebook ad for 100% Pure New Zealand. Facebook told me it was based on a dataset provided by the data analytics firm Quantium.

But if Quantium doesn't have my personal details, how does it target me?

The tourism ad was sent to two consumer segments — "outdoor enthusiasts" and "travellers" — a Quantium spokesperson said.

The company received de-identified purchase data, likely from the Woolworths Rewards program, which was then used to create anonymous groups of people likely to purchase something based on their past shopping behaviour.

My de-identified data was probably in there. Then, apparently, Quantium matched it up with my de-identified data from Facebook.

"Publishers like Facebook de-identify their users' personal data utilising the same encryption algorithm used by Quantium," the Quantium spokesperson said.

"The de-identified data from both parties is passed into a secured anonymisation zone for matching purposes. This allows the two datasets to be matched without using any personal information."

In some cases, it gets more mysterious.

In Settings, Facebook lists the advertisers it says are running ads, using contact lists they uploaded to the platform.

Experian said it had no personal information about me, but Experian Data Quality is listed as having uploaded my contact information to Facebook.

A company spokesperson said it could not confirm why I was connected to Experian Data Quality.

"Based on the information you provided to us, we again confirm that Experian's Data Quality and Targeting (Marketing Services) in A/NZ does not hold any personal information on you," she wrote in an email.

Who else has your email?

Brands are only meant to upload contact lists to Facebook for advertising if they have permission to do so.

In the case of the video streaming service Stan, seeing its ad on Facebook made sense — I'm a subscriber, and apparently, I've watched the TV show Billions.

A Stan spokesperson said the ad I saw was intended to remind people "who may be fans of the show" that a new season was available.

It does this to highlight content the company thinks subscribers are interested in, using its internal analytics.

"We matched your encrypted email to data held by Facebook to facilitate the surfacing of that content," she added.

(I also asked for all my personal data from Stan, and the hours of television I've watched make for a terrifying spreadsheet, by the way.)


The contact list mystery

But Stan is not the only brand that has my information.

As I write this article, there are more than 300 brands that Facebook lists as having my contact information — the majority of which I've never heard of.

There's a sushi restaurant in Perth, for example, called Tao Café. I've never visited.

I got in touch, and Tao Café office manager Annette Sparks was equally baffled about its appearance on my list.

But she said that the food delivery company Deliveroo ran ads on behalf of the company, and suggested that's how my contact details may have been bound up with the sushi venue.

So, onto Deliveroo.

While they couldn't discuss my personal situation, a spokesperson said Deliveroo does provide "marketing support" to its restaurant partners — essentially, it runs ads promoting them as part of the delivery service.

Did Deliveroo then share my email with cafes from Perth to Singapore? The company said no.

"Under no circumstances does Deliveroo share any customer details with restaurants or other third parties as part of these marketing campaigns," the spokesperson said.

I'm left none the wiser about why Tao Café was on the list — and there are other mysteries too.

According to Facebook's list, various American political candidates have my contact information.

As does the official Facebook page of the actress Kate Hudson.

What can I do?

Mark Zuckerberg has said Facebook users own their data, but it's an unusual kind of ownership.

Ownership feels largely meaningless when your data is scattered around the internet.

There is no one company to blame. The architecture of online advertising is set up this way.

"The issue is that in the digital space … personal data is very much sought after, and there are all [kinds of] different players who stand to benefit from access to that data," Mr Molitorisz said.

"There needs to be greater transparency with how our data is used."

This is the reality of surveillance capitalism, according to Professor Katina Michael, a privacy expert at the University of Wollongong.

Our data is a valuable commodity, and time is not on our side when it comes to understanding who wants it and where it's going.

"We don't measure it, we don't write it down like we do calorie-controlled diets," Professor Michael said. 

"We don't realise how much we're giving away."

Ariel Bogle, April 28, 2018, "I asked everyone from Facebook to data brokers to Stan for my information. It got messy", ABC Radio National, http://www.radioaustralia.net.au/international/2018-04-28/i-asked-everyone-from-facebook-to-data-brokers-to-stan-for-my-information-it-got-messy/1752610

Biometric data from driver's licences added to government database


Your face is becoming the latest weapon in the world of digital surveillance, and the humble driver's licence looms as a game-changer in tracking individuals through both the real and virtual world.

Experts warn your biometric data may already be vulnerable to misuse by criminals and terrorists, as the proliferation of mobile cameras combined with social media and ubiquitous CCTV feeds mean we're caught on screen more than ever before.

Key points

  • Biometric data builds an online profile using your photo, age and address
  • This can then be matched against photos gathered from the internet or CCTV
  • The data can be used by government agencies, along with companies and criminals

Driver's licences will be added to the Commonwealth Government's already vast biometric databases after it struck an agreement with the states and territories, handing authorities access to an unprecedented level of information about citizens.

A system known as "the Interoperability Hub" is already in place in Australia, allowing agencies to take an image from CCTV and other media and run it against a national database of passport pictures of Australian citizens — a process known as "The Capability".

But soon driver's licences will be added to the system, allowing both government and private entities to access your photo, age and address.

It is a $21 million system being sold as a way to tackle terrorism and make commercial services more secure.

But experts warn people now risk losing control of their biometric identity entirely as commercial interests, governments and organised crime gangs all move to capture more personal metadata for their own gain.

Driver's licences change the biometric game

Technology and legal expert Professor Katina Michael said about 50 per cent of the population already had some kind of visual biometric stored in a nationally accessible database, but the inclusion of driver's licences would see the proportion of Australians scooped up in the net swell to about 80 per cent.

She said one of the biggest risks of the collection of biometric data was not deliberate misuse by the AFP, ASIO or another government agency, but rather vulnerabilities in the way biometrics work.

Who can access your biometric data?

Document Verification Service (DVS) - government and private sector

  • Companies and government can run an identity document through a database to see if it matches information held on file, and that the document has not been revoked
  • Individual must consent before DVS used

Face Verification Service (FVS) - government and private sector

  • Enables a facial image of an individual to be compared against government records of that same individual, such as passports and driver's licences
  • Individual must consent or a legislative basis must be established to collect the information, and use must comply with the Privacy Act

Face Identification Service (FIS) - only law enforcement agencies can use

  • A facial image can be compared against multiple facial images held on a government database, including Australian citizens' passport photos and now driver's licences.
  • Multiple records of people who have a close match to the image are usually returned
  • An agency must have a legislative basis or authority to collect and use the information
  • Access is restricted to law enforcement agencies or those with national security related functions

"It's not like a one-on-one match, where you put (in) an individual's face and say: 'they're a suspect'," Professor Michael said.

"But rather what you get returned is a number of possibilities … you might get back 15, or 20, or 30, or 50 matches.

So you might have 50 innocent people being suspects, rather than the person that you're trying to catch
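
In other words, a face identification query is a one-to-many search: the probe image is compared against every face in the gallery, and everything scoring above a similarity threshold comes back as a candidate. The sketch below illustrates that behaviour with synthetic embeddings and a hypothetical threshold; it is not the algorithm behind "The Capability", just a toy model of why one image can return dozens of records.

```python
import numpy as np

# Toy one-to-many face identification. Embeddings, cluster sizes and the
# threshold are all hypothetical; this is not the government system's algorithm.
rng = np.random.default_rng(0)

def unit(v):
    """Normalise embedding vectors to unit length."""
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

dim = 128
base_face = unit(rng.normal(size=dim))                              # person of interest
lookalikes = unit(base_face + 0.05 * rng.normal(size=(30, dim)))    # 30 people with similar faces
strangers = unit(rng.normal(size=(10_000, dim)))                    # the rest of the database
gallery = np.vstack([lookalikes, strangers])

probe = unit(base_face + 0.05 * rng.normal(size=dim))               # noisy still from CCTV

scores = gallery @ probe            # cosine similarity, since all vectors are unit length
threshold = 0.7                     # hypothetical operating point
candidates = np.flatnonzero(scores >= threshold)
print(f"{candidates.size} records returned for a single probe image")
```

Because near-lookalikes clear the same threshold as the person being sought, each query can sweep innocent people into an investigation, which is the risk Professor Michael describes.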

Professor Michael said this meant that while over time a person's name might be cleared, their data could remain in a database linked to a criminal investigation.

"And then I'm thinking, what happens to their level of innocence as time goes on, because they accidentally look like a minority group?" she said.

She said real criminals and terrorists would opt out of the system, choosing not to have passports and driver's licences in a bid to escape the net.

"Of course, if you've done nothing wrong, the old adage says you're fine. But increasingly, we don't know if we're fine," she said.

The rise of 'uberveillance'

Professor Michael said modern surveillance methods employed by law enforcement were not just limited to CCTV — they now incorporated vast amounts of metadata and social media, leading to a concept known as "uberveillance" in which people were constantly monitored.

"What we have now are digital footprints that we all leave behind," she said.

"Phone call records, internet searches, credit cards and even the data on your electronic train or bus ticket can be used to track your movements and activity.

"It brings together all these various touchpoints, telecommunications records, travel data via tokens, facial recognition on federal databases, your tax file number … that's accessible depending on the level of crime and social media.

"You've got this very rich almost cradle-to-grave kind of data set that's following you."


Organised criminals want your identity

Stephen Wilson runs Lockstep Consulting, a Sydney-based firm which researches and tracks trends in biometrics in the corporate and government spheres, and advises clients on best practice.

He said at the moment very secure biometric systems took quite a long time to process images accurately.

Problems arose when consumer convenience, such as being able to unlock a phone or access a bank account with a quick face or fingerprint scan, trumped security.

"No police force, no public service, no business is ever perfect, there is always going to be corrupt people," Mr Wilson said.

"The more exposure we have to electronic databases, the more exposure we have to biometric matching, it's only a matter of time before these bad actors succumb to temptation or they succumb to corruption and they wind up using these systems inappropriately."

Your biometric twin is out there

VIDEO: Professor says nothing to fear from federal driver's license database (ABC News)

Mr Wilson said biometrics were creeping into consumer services like bank accounts and online betting facilities, with customers asked to send a picture of their licence and a "selfie" that will be run through an identity matching service.

"The real risk is that bad actors will take people's photos, ask for a match, and get back a series of matches of people that are kind of like your biometric twin," he said.

"We've all got doppelgangers, we've all got people in public that look just like us.

"If you're trying to perpetrate a crime, if you're organised crime, and you're trying for example to produce a fake driver's licence, it's absolute gold for you to be able to come up with a list of photos that look like 'Steve Wilson'."

Technology companies like Apple and Samsung have championed the use of biometrics such as fingerprints, and this has taken a step further with facial recognition becoming more common thanks to the release of the iPhone X.

PHOTO: Apple's iPhone X has championed facial recognition technology. (Twitter: AppleEventos)

However, Mr Wilson said a key difference was that this information stayed on the phone, while banking and other commercial interests trying to use your biometrics to confirm your identity could be storing it on a server anywhere.

"Do you really want your photo, which is a pretty precious resource, sent off to a company perhaps on the other side of the world just so you can get a quick bank account or quick betting service set up?" he asked.

What will happen next?

An annual industry survey conducted by the Biometrics Institute, known as the Industry Trend Tracker, has nominated facial recognition as the biometric trend most likely to increase over the next few years.

Respondents believed privacy and data protection concerns were the biggest constraint on the market, followed by poor knowledge of decision makers, misinformation about biometrics and opposition from privacy advocates.

The Australian Law Reform Commission says biometric systems are increasingly being used or contemplated by organisations, including in methadone programs, taxi booking services, ATMs and online banking, and access to buildings.

Professor Michael said governments needed to be very cautious about how they applied this rich new source of data in the future.

She said governments were building these agreements between themselves and corporations in a bid to stamp out fraud, but that goal was not always achieved and the potential for mistakes was vast.

"What we have is this matching against datasets, trying to find the needle in the haystack," she said.

"Often what happens is we don't find the needle."

A statement from the Department of Home Affairs said the Australian Government was exploring making the Face Verification Service available to the private sector, but nothing had started at this point.

It said arrangements for private sector access would be informed by an independent privacy impact assessment, and those using it would need to demonstrate their lawful basis to do so under the Privacy Act and show where they had gained consent to use a person's image.


Source: Rebecca Trigger, January 15, 2018, "Experts sound alarm as biometric data from driver's licences added to government database", ABC News, http://www.abc.net.au/news/2018-01-15/alarm-raised-as-drivers-licences-added-to-government-database/9015484

Reprinted in The New Daily here: https://thenewdaily.com.au/news/national/2018/01/15/biometric-data-drivers-licences-government-database/

Furthermore, an interview with Professor Brian Lovell from the University of Queensland on the ABC further demystifies facial biometrics and the government's use of The Capability: http://www.abc.net.au/news/2018-01-15/professor-says-nothing-to-fear-from-federal/9330626


Kekhawatiran Semakin Terbukanya Data Pribadi di Era Digital (Concerns over the Growing Exposure of Personal Data in the Digital Era)

Monday, 15 January 2018, 11:05 WIB

Our faces are becoming the latest tool in the world of digital surveillance, and in Australia driver's licences are starting to be used to track people in both the real and the virtual world.

Biometric Data

  • A biometric system identifies a person by their physical features, characteristics, and behaviour
  • Biometric data builds an online profile of a person from their photo, age, and home address
  • This personal data is then matched against images captured on CCTV or photos found on the internet
  • The data can be used by government agencies, as well as by companies and even criminal groups

Experts warn that our biometric data may already be vulnerable to misuse by criminal gangs and terrorists, because the combined spread of phones and social media, along with CCTV cameras everywhere, means we are caught on camera more often than ever.

Driver's licences will be added to Australia's biometric databases following an agreement with the states and territories, giving authorities access to information about their citizens in a way that has never existed before.

A system known as 'The Interoperability Hub' already exists in Australia, allowing authorities to take an image from CCTV or other media and match it against a database of passport photos. This process is known as 'The Capability'.

But once driver's licences enter the new database system, the government and a number of private parties will be able to access your photo, age, and address.

The system has cost $21 million (around Rp 210 billion) and is being sold as a way to tackle terrorism and make commercial services more secure.

But experts warn that citizens now risk losing their biometric identity altogether, as commercial parties, government, and organised crime groups try to obtain ever more personal data for their own gain.

Driver's licences become a new data source

Driver's licences in Australia have already been added as a source of biometric data.

Technology and legal expert Professor Katina Michael said about 50 per cent of the Australian population already had some kind of visual biometric stored in a nationally accessible database, but the inclusion of driver's licences would push that figure up to about 80 per cent.

Professor Michael said one of the biggest risks of collecting biometric data was not deliberate misuse by the Australian Federal Police (AFP), the intelligence agency ASIO, or another government agency, but rather the vulnerabilities in the way biometrics work.

"It's not like you put in someone's face and then say, 'they're a suspect'," Professor Michael said.

"What you get back is a number of possibilities ... there might be 15, 20, 30, or even 50 matches."

So rather than the one person you are trying to catch, you may end up with 50 innocent people as suspects.

Professor Michael explained that this means that even if a person's name is cleared over time once they are proven innocent, their data may still sit in a database linked to a criminal investigation.

People are monitored continuously

New technology can monitor and recognise people's faces in a crowd.

Professor Michael said the modern surveillance methods used by law enforcement were not limited to CCTV. They can now also draw in vast amounts of metadata and social media, leading to a concept known as "uberveillance" in which people are monitored continuously.

"We now have the 'digital footprints' that everyone leaves behind," she said.

"Phone call records, what you search for on the internet, credit cards, and even the data on your electronic train or bus card can be used to track your movements and activity."

What next?

An annual industry survey conducted by the Biometrics Institute, known as the Industry Trend Tracker, has found that facial recognition is likely to be the biometric trend that grows most over the next few years.

Survey respondents saw privacy and data protection concerns as the biggest constraint, followed by poor knowledge among decision makers, misinformation about biometrics, and opposition from privacy advocates.

The Australian Law Reform Commission says biometric systems are increasingly being used or considered by many organisations, including drug rehabilitation programs, taxi booking services, ATMs and online banking, and building access.

Professor Michael said governments must be very careful in applying this rich new source of data in the future.

She said governments were building agreements with a number of companies in an effort to avoid fraud, but that this was often not achieved and the potential for misuse was vast.

"What we are doing in matching against these datasets is like finding a needle in a haystack," she said.

A statement from the Department of Home Affairs said the Australian Government was exploring the creation of a verification service using facial recognition for the private sector, but this effort had not yet begun.

Adapted from the original report in English, which can be read here.

See the article at Australia Plus

Mandatory Data Breach Notification (2017 Amendment to Privacy Act)

Today I had the pleasure of speaking to Meredith Griffiths, a reporter with the ABC, about the newly enacted Mandatory Data Breach Notification (MDBN) scheme that takes effect on 22 February 2018.

Some of the main points I made in the interview with the help of my colleagues at the Australian Privacy Foundation (primarily David Vaile) were:

MDBN doesn't go far enough because:

  1. small businesses with less than $3m annual turnover are exempt from MDBN
  2. self-assessment of "serious harm" is ambiguous (on what test do companies come forward, and only if the Privacy Commissioner agrees it is serious? What if a breach is slightly serious on one view and very serious on another: do companies take the easy way out and not disclose?)
  3. companies are given 30 days to make a data breach notification to the Privacy Commissioner (too long for customers to be kept in the dark, and thereafter how long might it take the Privacy Commissioner to determine 'seriousness' and/or publicly respond with an unenforceable determination?)
  4. what about data breaches offshore (how do Australians respond to the loss of their personal information abroad)?
  5. what about 'open data' re-identification through AI/machine learning?
  6. the OAIC is overloaded and slow, and its determinations are rare and, in any case, unenforceable.

So where does this really leave us? We have a law that neither prevents breaches of personal information nor compensates individuals for privacy breaches. What we need to do is consider the outcomes of the ALRC's 2008 report, which stipulated that we need a tort for serious invasion of privacy, so that individuals CAN sue other individuals (like hackers), companies (like Google), or government agencies for breaches of their privacy (whether accidental, deliberate, or through some form of negligence).

The lack of auditability in the new law means that current practices that rely on de-identification to safeguard people's personal information, say in the case of OPENGOV data initiatives, may not be enough down the track as the threat grows from machine learning algorithms that can look at patterns of information and single out individuals, like finding a needle in a haystack. The consequences of going down this path are grave, including the potential for re-identification by bringing several disparate treasure troves together, such as social media data, government data, and personal records, to be analysed.
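
To make the re-identification risk concrete, consider a toy linkage attack: a "de-identified" open dataset is joined to an identified auxiliary source on quasi-identifiers such as postcode, birth year and gender. Everything in the sketch below is hypothetical, including the column names and records; it only shows the mechanics that machine learning makes faster and easier at scale.

```python
import pandas as pd

# Hypothetical example of a linkage attack on "de-identified" open data.
# All records and column names are invented for illustration.
open_data = pd.DataFrame({
    "postcode": ["2500", "2500", "2017"],
    "birth_year": [1981, 1990, 1975],
    "gender": ["F", "M", "F"],
    "diagnosis": ["asthma", "diabetes", "hypertension"],  # the sensitive attribute
})

# An identified auxiliary source, e.g. scraped social media or a leaked list.
auxiliary = pd.DataFrame({
    "name": ["J. Citizen", "A. Example"],
    "postcode": ["2500", "2017"],
    "birth_year": [1981, 1975],
    "gender": ["F", "F"],
})

# Where the quasi-identifier combination is unique, the join attaches a name
# to the supposedly anonymous record.
reidentified = auxiliary.merge(open_data, on=["postcode", "birth_year", "gender"])
print(reidentified[["name", "diagnosis"]])
```

When that combination of attributes is unique, the join hands back a name next to the sensitive value; the more treasure troves are brought together, the more combinations become unique.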

Links to MDBN include:

https://www.oaic.gov.au/media-and-speeches/statements/mandatory-data-breach-notification

https://www.oaic.gov.au/privacy-law/privacy-act/notifiable-data-breaches-scheme

https://www.oaic.gov.au/media-and-speeches/news/retailers-check-out-mandatory-data-breach-reporting-obligations-and-prepare-for-2018

Having a statutory tort of serious invasion of privacy (as in the UK and US) or a common law tort (as in New Zealand) allows individuals to sue other entities depending on the severity of the privacy breach. Why is Australia lagging so far behind other advanced digital nations? When will this legislation be amended?

Already, we are seeing large ICT companies set up "shop-fronts" in Australia with NO enforceable penalties for international misdemeanours when it comes to amassing treasure troves of data, and for data breaches offshore. How do we hold these companies accountable when they take in a great deal of business from Australian consumers and yet seem to be let loose in the "wild" to do as they please, storing data in the cloud in the USA or Ireland? Bruce Schneier called this "data as a toxic asset". As the toxicity rises, we can expect major pollution spills.

For now, at least we can say that the MDBN is a step in the right direction, even though it falls short through exemptions and loopholes. It can have some reputational impact, via their subscriber base, on "data addicts" that don't do the right thing, but little more. Sadly, large corporations can absorb this reputational damage within their "risk appetites". The fines are also "measly" when it comes to government or regulatory action, and so corporate and government entities in particular are left to their own devices here in Australia. While well-meaning, the scheme seems to be little more than a theatrical show: data hosts are still not responsible for improving their security practices or for urgently responding to and fixing a breach.

Data is a bit like mental illness. You can't see it. It is not tangible. You cannot put a price on mental health, and you cannot put a price on your personal data. While we can manage damage to property very well, because we can see a scratch on a car, or the loss of inventory, we cannot see data as we see a broken arm.

We already have very weak privacy legislation. Australia needs to get serious, as Europe has through the General Data Protection Regulation (considered the gold standard), about the value of personally identifiable information (PII). Both the Liberal and Labor parties need to listen to the commissioned reports of the Australian Law Reform Commission and act on the implementation of statutory tort legislation with respect to serious invasions of privacy. There is no reason why this has not happened yet.

The Sustainability Report Podcast

Posted By Rachel Alembakis on October 27, 2017 in Corporate Reporting, Fund Management, podcast. On this episode of The Sustainability Report Podcast, we’re talking about how technology is changing how our economy operates and what disruption means both socially and economically.

Innovation in renewable energy, battery storage and other areas has brought the means to transition our world to a low-carbon future within our grasp, but disruption is a two-sided coin, and companies and stakeholders must think about how their products and processes will shape how we interact.

We’re talking with Dr Katina Michael, a professor at the University of Wollongong in the School of Information Systems and Technology in Australia. Katina was previously employed as a senior network engineer at Nortel Networks (1996-2001) and has also worked as a systems analyst at Andersen Consulting and OTIS Elevator Company. Katina addresses the value of privacy, the impacts of artificial intelligence, and how investors and companies could frame the discussion of long-term values and ethics.

Producer: Buffy Gorrilla

Original Source: http://www.thesustainabilityreport.com.au/sustainability-report-podcast-dr-katina-michael/


Rachel Alembakis has published The Sustainability Report since 2011. She has more than a decade of experience writing about institutional investments and pension funds for a variety of publications.