State, Siri And You

The Indian Express     16th October 2021
QEP Pocket Notes

Context: Plans for greater deployment of AI and harvesting of data, without any rights paradigm or data protection law in place, are deeply unsettling.

Algorithmization of everyday life

  • Artificial Friends (AFs): Future progression of technology envisages children having AFs, that is, machines with AI programmed to respond to human beings; as they observe more, they store more information about their owners and learn how to respond to them.
  • “Deep learning” tools: Artificial intelligence programmes that are able to absorb information and start demonstrating reasoning of the kind that distinguishes us as humans.
  • Deepening information apparatus: Where citizens have vital information on themselves stored as part of state or private data platforms, such as:
    • Widening state information collection: for Aadhaar, for vaccinations on CoWIN, from our tax returns, our driving licences and a host of other government instruments.
    • Private platform footprints: our opinions, our likes and dislikes, our ideologies on platforms like Facebook, Twitter and a host of others.
  • Digital entertainment: We now consume popular culture in the privacy of our laptops through OTT platforms like Amazon Prime, Netflix or Hotstar, instead of going to the theatre.
  • Creation of a data marketplace: The NITI Aayog National Strategy for Artificial Intelligence (2018) points to creating a data marketplace to bring “buyers and sellers of data together”.
  • Digitisation of life: The 2018 NITI Aayog strategy also points to the greater need for AI in sectors like education, healthcare and agriculture.

Impediments of over-algorithmization

  • Ethical questions: Can everything be mechanised? Is love substitutable?
  • Surveillance threats – mass manipulation: Based on our data, algorithms feed our Twitter/Facebook selves news and information that aligns with our existing beliefs.
  • Political misuse of data, as seen with Cambridge Analytica.
  • Identity crisis: These tools harvest our personal data, the very material that makes us human and defines our personalities.
  • Technological threats – algorithmic bias: The data fed in to create the algorithms reflects the opinions and biases of the programmers feeding in the information.
    • For instance, is the zip code of where a person lives likely to indicate the chances of committing fresh crimes? If you live in a white neighbourhood, are you less likely to commit a new crime?
  • Legal gaps in India: There is no data protection law in place, even though a Bill is being discussed by the parliamentary committee on information technology.
    • Lack of access: Citizens have no rights over their data, no protection from its extraction and, in general, no safeguard against its misuse.
    • Privacy and security concerns: The State has unilateral rights to collect and use our data, and has also given itself the power to regulate private parties.
    • The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 are used to mandate that WhatsApp, which uses end-to-end encryption, must enable the identification of the first originator of the information.
    • Rights under threat: The current legal regime violates the constitutional premise that citizens' speech, expression, intellectual property and liberty rights must be protected.
  • Drive towards monetisation: NITI Aayog paper envisages that state policy includes creating a data marketplace — a “deployable model” in which it seeks to bring “buyers and sellers of data together”.

Conclusion: A data protection regime must be put in place before the algorithmic footprint is expanded further.