New app idea – one that detects if you’re listening to a podcast and pauses it if you type for more than a couple of seconds… then restarts it when you hit “send” on that email that has made you miss the last three minutes of your podcast.
‘An automaton of the internet’: Selfie in front of maimed woman sparks ethics debate http://ow.ly/psm530kmhuA
What happens when we train an image recognition AI with images from the “darker side” of the web? http://ow.ly/uYDm30kl7YN
Facebook pilloried for not protecting user data, promises to do better. European government implements laws protecting user data. Facebook moves user data to US to avoid new laws. Actions speak louder than words. http://ow.ly/OOTX30jAf7b
A new social contract which underpins how and when we share data must be based on ethical use of that data by those who control it. http://ow.ly/5Inj30jk90f
Today’s raid of Cambridge Analytica’s offices followed revelations that the company had been involved in harvesting data from 50 million Facebook users and then using that information to influence the 2016 US presidential election.
After their stock lost $60 billion in value, Facebook apologised for the “breach of trust” and promised to do better in future.
However, Mark Zuckerberg’s potted timeline of related events leading up to the incident makes it clear that the Facebook Platform, released in 2007, was operating as intended. It was specifically designed to “enable people to … share information about [their friends]”.
In addition, Paul Grewal, VP &amp; Deputy General Counsel for Facebook, made it clear that this was not a data breach and that all data had been obtained from users “knowingly”.
So why is Facebook facing a backlash from their investors and users, if the system operated as expected and was not compromised?
The answer is that Facebook built a system without considering the ethical implications of its functionality. The system presented a request for authorisation which its authors knew users wouldn’t read or understand, and it then shared private information about other users without their knowledge.
We have names for those who exploit human weakness for their own gain – a shyster, a con artist, a fraud. We also have names for those who share private information about their friends – a gossip, a grass, a rat – a traitor.
Digital systems are now such an integral part of so many human activities that it becomes imperative that we view their operations, both intended and unintended, in these terms. Digital ethics should now be a fundamental consideration for all organisations building or implementing digital systems with the power to influence the lives of their users and their staff.
Instead, Facebook’s early focus was on gathering as much data from their users as possible and then building ways to monetize and share that data as widely as possible.
Facebook’s early motto was “Move fast and break things”.
At their IPO, Zuckerberg announced the five core company values, none of which referenced treating users or their data ethically:
- Focus on Impact – “work on the biggest problems”
- Move Fast – “prioritise grasping opportunities over avoiding mistakes”
- Be Bold – “take big risks, don’t be afraid to fail”
- Be Open – “share as much information as possible as widely as possible”
- Build Social Value – “give users the opportunity to express themselves online”
It is these values, bereft of ethical consideration for the end user, which spawned the digital attention crisis and which this month resulted in the betrayal of the trust of millions of users.
Taking a conservative estimate that 50% of the funds raised by Facebook through to 2007 were invested in product development, Facebook spent around $20m to build the Facebook Platform.
Not building in digital ethics cost them $60 billion.
[All opinions are my own and do not represent those of my employer.]
As a developer, you need long, undisturbed periods of time in order to concentrate on the problem at hand. Your best, most productive time is when you are in ‘the zone’.
Any developer worth their salt has experienced this state, where the outside world disappears, you’re holding 20 variables in your head at once and are being incredibly productive. A lot of job satisfaction is generated from this place; I personally equate it with the creative satisfaction that artists experience. It’s powerful stuff.
The bad news: To introduce even minor managerial responsibility – say a team lead role – into your job description is to receive a lifetime ban from ‘the zone’. Once you become a hub for communication and decision-making, your chances of getting enough unbroken time to reach that effective, productive state are effectively zero. You are likely to be regularly frustrated by the constant interruptions of your colleagues.
The fix: I’m afraid the only effective solution I’ve seen to the problem of splitting your time between coding and managing is to give up on one or the other. If you’re attached to the management element of your role, you should also seek to ramp up other non-coding activities that can be completed in shorter bursts.
Design, documentation, code reviews, coaching and client interaction all fall into this category. If you’re attached to your coding, don’t be afraid to say as much and just pass up the whole management thing altogether.
Maker’s schedule vs manager’s schedule: http://paulgraham.com/makersschedule.html