The internet is broken, as the proliferation of cyberattacks, fraud and piracy, fuelled in part by users' poor digital literacy, shows.
Systemic Automation = FAIL
The software industry is failing in many respects because automation is not, contrary to what is claimed, an exact science. Treating big data as error-free and capable of self-regulation by machines is more an ideology than a reality. Indeed, even within a quantitative approach, human regulation remains necessary, as machines cannot formulate an opinion and lack decision-making abilities.
Firstly, many commentators have pointed to the problem of false positives or to the ethical considerations of testing models, but there is also the problem of formulating assumptions prior to the analysis instead of testing hypotheses in a dedicated phase before applying them. In profit-driven economies, scientists will choose the less costly option, minimizing the benefits of investigating knowledge thoroughly.
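The false-positive problem mentioned above can be made concrete with a base-rate calculation. The sketch below uses purely hypothetical numbers (a fraud detector with 99% sensitivity and 99% specificity on traffic where 0.1% of events are fraudulent) to show why even an accurate automated system still needs human review of its alerts:

```python
# Base-rate illustration (hypothetical numbers): even an accurate
# automated detector produces mostly false positives on rare events.
def positive_predictive_value(prevalence, sensitivity, specificity):
    """Probability that a flagged case is a true positive (Bayes' rule)."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

# 99% sensitivity, 99% specificity, 0.1% prevalence:
ppv = positive_predictive_value(0.001, 0.99, 0.99)
print(f"{ppv:.1%}")  # prints "9.0%" -- about 91% of alerts are false alarms
```

Despite the detector's apparently excellent accuracy, roughly nine alerts out of ten are false alarms, which is precisely where human judgement has to step back in.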
By doing so, data scientists introduce bias into their measurements and manipulate data to make them fit the frame they designed before any inductive observation. The mismatch between prior hypotheses and the ground truth emerging from large-scale data collection is amplified by the absence of any further verification through human intervention.
Beyond the issue of reflexivity, weakness and overconfidence in results lead data scientists to falsified findings, as machines exclude much significant and pertinent data because they are not capable of the human critical thinking that is also grounded in experience.
Results are often based on the single criterion of repetition, which is not effective unless coupled with other dimensions. Another mistake made by quantitative scientists is correlating dimensions that are not commensurable, which leads to a lack of scientific evidence.
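The point about correlating incommensurable dimensions can be illustrated with synthetic data. In the sketch below, two entirely made-up and unrelated series (the names are hypothetical, in the spirit of the well-known "spurious correlations" genre) both happen to trend upward over time, which is enough to produce a near-perfect Pearson correlation without any causal link:

```python
# Sketch (synthetic data): two unrelated quantities that both trend
# upward over time show a high Pearson correlation, yet the link is
# spurious -- repetition of a pattern is not evidence of a mechanism.
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

years = range(10)
cheese_consumption = [20 + 1.5 * t for t in years]           # hypothetical series A
phd_awards = [500 + 40 * t + (-1) ** t * 5 for t in years]   # hypothetical series B

r = pearson(cheese_consumption, phd_awards)
print(f"r = {r:.3f}")  # r above 0.99 despite no causal relation
```

A correlation this strong between dimensions that share no common unit or mechanism is exactly the kind of "evidence" the paragraph above warns against.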
Transcending the dichotomy between pessimistic and optimistic views of science, media and technology, the quantitativist drift pulled by business-only approaches is a serious limitation on scientific discovery: for lack of effectiveness and usability, many solutions and products never reach the stage of development and successful implementation (only about half of them succeed, incidentally). The need for complex manipulations, together with siloed structures, deeply limits the promise of big data for fostering global problem solving.
These scientific models are based on abstract paradigms that show no ethical consideration for practical realities: neither positivism or post-positivism (the model of infallibility raised by technological determinism) nor constructivism (more an epistemological wager than a sustainable model that has proved its applicability to concrete cases) is apt to build integrative technologies. We need more participatory approaches and more action research.
Big data = environmentally costly
Big data wastes resources. It wastes development effort on products that serve neither societal nor economic needs but only the mere logic of individual benefit (studies are regularly published underlining the lack of practical impact of most technological equipment). It wastes bandwidth through the permanently alienating upgrade cycle of apps, software and solutions, and through the multiplication of channels and platforms offering the same service ten times over without differentiation. It wastes storage on the ghosts that populate many platforms: inactive accounts created for a service that is later abandoned, hacked accounts, failed transmissions and bugs. Finally, it wastes the human production of cultural goods, as much knowledge and know-how that never reaches the visibility required to be detected by automated processes is irremediably lost.
In brief, mindless automation will create more alienation in our post-liberal societies.
This is where critical theories from post-colonial, feminist, anti-racist and anti-authoritarian studies, together with qualitative research conducted under sound ethical and methodological principles, can lead to the definition of neo-paradigms for research that better integrate the social and practical dimensions of categories of experience and locally produced knowledge. By analyzing and interpreting the micro-contexts of news production while observing historical and cultural backgrounds, digital sociologists and qualitative researchers provide findings that can address the "glocal" (the juxtaposition of the local and global dimensions) dystopia, in order to resorb the limitations encountered with big data alone by complementing and qualitatively enriching it.
Creative Commons License
This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.