Since the early 2000s, the spread of cameras in handheld devices and personal computers has led to a marked increase in video recordings of human interaction and behaviour. User-generated content drives a growing number of audio-visual websites, but it is important to consider the ethical, legal, and social implications of these content-rich resources, to ensure that new AV websites abide by the law with human needs in mind.
In this report the question of legality is broken down into data and information governance, information management, human rights, and copyright. Beginning from these areas, the report will delve deeper into how human interaction can make legal issues more complex. Moral reasoning then brings the analysis to the ethical implications surrounding uploading, moderating, and maintaining audio-visual hosting websites. The ethical argument will examine impartiality, balance, bias, privacy, and the public interest.
Disinformation and misinformation are key areas of study too. Recognising the difference between them, and implementing rules to minimise both, is something we have seen with Facebook and Covid-19, where systems were put in place to prevent the spread of “fake news”. This problem is ever present across the web: with the ability to upload video content of any genre, information can lose its validity each time it is reposted, as confirmation bias and editorial angle chip away at its accuracy.
Being online also brings social issues such as human-business interaction, product influence, political affiliation, and bias. Looking into the ways public perception can alter or influence a website, and the content it predominantly produces, can help predict or prevent destructive and questionable behaviours and interactions, especially on user-controlled websites.
“The brands that will be big in the future will be those that tap into the social changes that are taking place.” — Sir Michael Perry, Chairman of Centrica PLC
The rise in online activity has opened a path to a myriad of socio-economic issues. An AV website can have a wide array of uses, for good and for bad, so this report will discuss the impact of this social aspect. In addition, a recent study of self-disclosure on the internet has outlined the need for balance between privacy and exposing information (Nicole C Krämer, 2020). Self-disclosure is almost unavoidable in user-created visual content, but it can create serious ethical problems such as security risks. Optimising a website to protect as well as promote information is key to its success; this report will investigate methods of privacy protection and possible weaknesses in the service with people in mind.
Data quality is measured by its accuracy, timeliness, relevance, completeness, trustworthiness, and contextual definition (Lai Kuan Cheong, 2007). Any IT infrastructure must be managed transparently, with internal controls in place to prevent fraudulent data or usage. For personal data, websites must ensure that data governance is in place, with information management working to protect the people using the website. GDPR is expected to have implications for any website's cybersecurity policy and practice, as it requires the implementation of measures that protect personal data and privacy against data loss or exposure (He Li, 2019).
Adopting a holistic approach focusing on people, processes, and technology is key. The main issue here lies in the inability to guarantee that only accurate content is uploaded; instead, moderation relies on adapting and removing content that shows signs of untrustworthiness. As figure one shows, a corporation has an ordered way of maintaining and improving data governance. However, when social websites consider data governance, many soft properties obstruct this simplicity (see appendix 1).
Human and Artificial Analytics: Fraudulence of Data
Human-powered analytics is a tool widely used to monitor and moderate websites such as Reddit; it provides a thorough service whereby everything is scanned with intent and contextual understanding. However, AI-powered analytic tools are on the rise (Darren Edge, 2018). It is possible to use algorithms to scan posts on social media and video-hosting websites to locate breaking news stories (Raz Schwartz, 2015) and check the data quality. Such AI includes stance classification, to determine whether a headline agrees with the article body, alongside text processing and image forensics to detect fraudulence. For breaking news articles, anomalies are detected by measuring correlations between posts uploaded with the same or similar headlines. These anomalies are then highlighted for a human counterpart to quality-check.
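To illustrate the flagging step, the sketch below uses a crude bag-of-words cosine similarity to flag a post whose headline barely overlaps its body, so a human can quality-check it. This is a minimal illustration, not the system from the cited work, and the 0.2 threshold is a placeholder.

```python
from collections import Counter
import math

def cosine_similarity(a: str, b: str) -> float:
    """Bag-of-words cosine similarity between two texts."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm_a = math.sqrt(sum(c * c for c in va.values()))
    norm_b = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def flag_for_review(headline: str, body: str, threshold: float = 0.2) -> bool:
    """Flag a post for human quality-checking when the headline
    barely overlaps the body text (a crude stance anomaly)."""
    return cosine_similarity(headline, body) < threshold
```

A production stance classifier would use trained language models rather than word overlap, but the pipeline shape is the same: score automatically, then route low-scoring items to a human.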
Expression via social media platforms is subject to various provisions under both the International Covenant on Civil and Political Rights (ICCPR) and the European Convention on Human Rights (ECHR) (Coe, 2015). The hierarchy of expression starts with artistic content, which makes video-hosting sites a high-value medium of expression due to the creative nature of their content. However, the legal position is clear: there is a right to offensive or shocking expression. Because online AV websites have an open, content-creator-based design, the potential for media to conflict with freedom of expression arises when the expression is criminalized, e.g. hate speech. The implications of this can leave the organization under scrutiny.
A large percentage of videos uploaded to audio-visual hosting websites such as YouTube contain content that could violate YouTube's community guidelines, in that the videos contain copyright violations. This can be put down to the low publication barrier and the anonymity of being online. A recent study used a scalable, real-time video crawling system to collect video metadata and detect copyrighted material (Nisha Aggarwal, 2014). This is a valuable tool to consider, given the legal issues surrounding copyright infringement. For an individual, copyright infringement can result (in the UK) in six months' incarceration and/or a fine of up to £50,000. If the host has paid for the rights to media such as songs and video clips, then there is no issue in hosting the videos. However, if a user uploads content that exposes the business to legal issues, the business owner is within their rights to remove that video, using human monitoring or metadata scanning to detect such an infringement.
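A simplified sketch of such metadata scanning is shown below. The titles and fields are hypothetical (a real service would query a licensing database, and this is not the crawler from the cited study): normalise the upload metadata, then match it against a list of protected works and flag candidates for review or removal.

```python
import re

# Hypothetical rights list; a real service would query a licensing system.
PROTECTED_WORKS = {"shape of you", "bohemian rhapsody"}

def normalise(text: str) -> str:
    """Lowercase and strip punctuation so matching is not defeated by styling."""
    return re.sub(r"[^a-z0-9 ]", "", text.lower())

def flags_copyright(metadata: dict) -> bool:
    """Return True when the title or description mentions a protected work."""
    haystack = normalise(
        metadata.get("title", "") + " " + metadata.get("description", "")
    )
    return any(work in haystack for work in PROTECTED_WORKS)
```

Flagged uploads would then go to a human moderator rather than being removed automatically, since a title mentioning a song may be fair use, licensed, or coincidental.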
Ethical and Social Impacts
Moderating and maintaining audio-visual hosting websites can evidently cause legal issues. To consider the broader implications of AV websites and the like, the concerns around privacy and security, politics, and bias must be mapped out. Ethics can become complex online, as there is no limit to the opinions and input from users, making it harder to appease everyone. It also means it is hard to predict what people will use services for, which is where security and privacy become paramount.
There are protocols in place for account security and online safety, such as password protection and end-to-end encryption, which prevents man-in-the-middle (MITM) attacks (Song, 2020). This is effective for the most part, and it is regarded as considerably harder to access accounts via spam, phishing, or hacking on media sites such as YouTube. This does not rule out the possibility of breaches, however.
In 2012, professional networking site LinkedIn suffered a breach compromising the logins of over 100 million accounts. Although the passwords were hashed, they were cracked and sold online, exposing the authentication credentials of millions of users. Storing encrypted passwords is a risk, since anything that has been encrypted can be decrypted. In addition, encryption keys are often stored on the same servers as the data being encrypted, so a breach can be catastrophic. LinkedIn's SHA-1 scheme failed to include salts, instead storing hashed passwords directly on the server (Gune, 2017). Salting is the addition of randomly generated data to a password before hashing. For example, the password 123456 has only one form when hashed unsalted, so the hash can be traced back to the original password using precomputed tables; adding even a 2-bit salt multiplies the number of possible hashed forms by 2² (four).
Content creators use AV websites to promote their work and for personal employment. This employment relationship can have negative implications, as it rests on an inherent asymmetry of power and a reliance on consent for monitoring and screening (Dimitris Gritzalis, 2014). If a creator's means of income is removed, for example a popular video is wrongfully penalized, then they will not be able to maintain their previous quality of life. In that case, the importance of data governance supersedes the importance of a person's emotional and economic needs.
One study (Lee Rainie, 2013) showed that 21% of people have had social media or email accounts hijacked, and 11% have had vital personal information stolen, such as credit card details. The creation of more websites holding sensitive personal data could create further issues: as more information becomes public, people become desensitized to the severity of data being posted online, which raises the question of whether privacy can exist online at all (Nicole C Krämer, 2020). Most internet users are aware of their personal information online and of how that information can be used. Roughly 50% of users are estimated to be worried about the amount of information about them online, a 33% increase on 2009 (Lee Rainie, 2013). This increase can be worrying, but it also means that people are now more aware of the implications themselves; therefore websites are now more likely to be used in a safe and thoughtful manner.
Political Profiling and Social Implications of Bias
Political beliefs and affiliation have been a cause of social marginalization and prejudice. Online video-hosting websites are a social medium that can support the study of users' political affiliation: the feeling of anonymity enables users to express their political beliefs, even the most extreme ones. This has led to issues regarding the targeting of people with adverts and the swaying of political views, as was evident in the Cambridge Analytica scandal. Videos uploaded to such AV websites can be used to target ads at viewers and content creators, and the community could question the intentions of the website. Hosting services should not push an agenda onto people who want to use the services for purposes unrelated to the targeted media.
Similarly, the ability to target individuals with highly emotional or arousing content will elicit responses based on personal impressions, and those impressions and opinions are formed around influence, not necessarily fact. News coverage concerning the use of lean finely textured beef (LFTB) is one such example: evidence suggested that the coverage (mainly on video-sharing sites) may have had a negative impact on public perception of the manufacturers (Patric R. Spence, 2016).
To ensure that data governance, human rights, and copyright obligations are followed effectively, it is important to maintain up-to-date software for monitoring uploaded content, and to hire staff trained in the consequences of legal protocols not being followed properly. These protocols will need to maintain a holistic approach, as the ethical and social implications of an audio-visual hosting platform reach far and wide. With privacy and security being a huge aspect of online services, it is extremely important that up-and-coming websites employ the best account encryption and data protection to serve their users. In addition, for future AV websites to reduce bias and take a more impartial approach to posted content, metadata crawlers or AI could be implemented to scan for problematic material for staff to then quality-check, without imposing personal bias on the content being analysed.
Caplan, S. E. (2002). Problematic Internet use and psychosocial well-being: development of a theory-based cognitive–behavioral measurement instrument. Computers in Human Behavior, 553-575.
Coe, P. (2015). The social media paradox: an intersection with freedom of expression and the criminal law. Information & Communications Technology Law, 16-41.
Darren Edge, J. L. (2018). Bringing AI to BI: Enabling Visual Analytics of Unstructured Data in a Modern Business Intelligence Platform. In Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems, 1-9.
Dimitris Gritzalis, M. K. (2014). History of Information: The case of Privacy and Security in Social Media. 1-25.
Gune, A. (2017). The Cryptographic Implications of the LinkedIn. 1-7.
He Li, L. Y. (2019). The Impact of GDPR on Global Technology Development. Journal of Global Information Technology Management, 1-6.
Lai Kuan Cheong, V. C. (2007). The Need for Data Governance: A Case Study. ACIS, 100.
Lee Rainie, S. K. (2013). Anonymity, Privacy, and Security Online. 1-35.
Legewie, N., & Nassauer, A. (2018). YouTube, Google, Facebook: 21st Century Online Video Research and Research Ethics. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 19(3), 21.
Nicole C Krämer, J. S. (2020). Mastering the challenge of balancing self-disclosure and privacy in social media. Current Opinion in Psychology, 67-71.
Nisha Aggarwal, S. A. (2014). Mining YouTube metadata for detecting privacy invading harassment and misdemeanor videos. Twelfth Annual International Conference on Privacy Security and Trust, Toronto, ON, 84-93.
Patric R. Spence, D. D.-R. (2016). Social media and corporate reputation during crises: the viability of video-sharing websites for providing counter-messages to traditional broadcast news. Journal of Applied Communication Research 2016, 199-215.
Raz Schwartz, M. N. (2015). Editorial Algorithms: Using Social Media to Discover and Report Local News. The Ninth International AAAI Conference on Weblogs and Social Media, 407-415.
Song, S. (2020). Keeping Private Messages Private: End-To-End Encryption on Social Media. 1-12.