#MyTrueSelfie for The Diana Award
Tackling low self-esteem, body image and bullying with technology
More than a million selfies flood onto our social network feeds every day. But how many of these go beyond our outward physical appearance?
Youth charity The Diana Award approached Octophin Digital to build a selfie app with a difference. By replacing filters and face-reshaping with positive words about a person, overlaid on an unedited photo taken in the moment, they wanted to highlight the deeper positive qualities that make us who we are.
See the finished product at MyTrueSelfie.com and people's selfies (including ones from Monica Lewinsky and James McVey of The Vamps) at #MyTrueSelfie, or read on to find out more about how the app was built.
We decided that for such a campaign to work, it would have to be quick to use whether you're on a laptop, phone or tablet, without forcing selfie-takers to install an app. So we built it on the greatest standards publishing platform the world has: the web. We're so glad we did, but it was a big challenge: we had to recreate the camera capture, upload, cropping, annotation, emoji, sharing and moderation behaviours people are used to from device-native apps, inside a web page that works for as many people and devices as possible.
We wanted to give The Diana Award as much control over content as possible, so we built a WordPress site around the app that they could update and blog from. It also includes a custom Twitter feed with a moderation system: using the Twitter API, it pulls in the favourites of an account of theirs and shows only those tweets on the #MyTrueSelfie hashtag, so the charity chooses exactly which tweets to showcase.
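The moderation step boils down to filtering an account's favourites down to the ones carrying the campaign hashtag. A minimal sketch of that filter, assuming tweets in the shape the Twitter API returns (statuses with an `entities.hashtags` array); the function and variable names here are illustrative, not the real code:

```javascript
// Keep only favourited tweets that carry the campaign hashtag.
// `favourites` is assumed to be the array returned by the Twitter API's
// favourites endpoint, each item carrying an `entities.hashtags` list.
function selfieTweets(favourites, tag = "MyTrueSelfie") {
  return favourites.filter((tweet) =>
    (tweet.entities.hashtags || []).some(
      (h) => h.text.toLowerCase() === tag.toLowerCase()
    )
  );
}

// Example: only the favourite that uses the hashtag survives moderation.
const favourites = [
  { id_str: "1", text: "Look at my #MyTrueSelfie!",
    entities: { hashtags: [{ text: "MyTrueSelfie" }] } },
  { id_str: "2", text: "Unrelated favourite",
    entities: { hashtags: [] } },
];

console.log(selfieTweets(favourites).map((t) => t.id_str)); // [ '1' ]
```

Because favouriting is something the charity already does from their own Twitter account, this gives them a moderation queue with no extra admin interface to learn.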
We built the core application in Node.js, using the GraphicsMagick library for image processing, Twitter's emoji library (twemoji) to make sure everyone's emojis look the same across devices, and many more open-source tools for handling touch gestures such as scaling and rotating, and for capturing the finished annotated image. To make it more fun, we added an animated GIF mode that combines separate frames into a single image on the server.
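To give a feel for what those touch libraries do under the hood, here is a hedged sketch of the two-finger gesture maths: given the start and current positions of two touches, derive how much an annotation should scale and rotate. This is illustrative only; the app relies on existing open-source libraries for gesture handling.

```javascript
// Derive scale and rotation from a two-finger gesture.
// Each argument is a pair of touch points {x, y}: the first pair is where
// the fingers started, the second is where they are now.
function gestureTransform([a1, b1], [a2, b2]) {
  const dx1 = b1.x - a1.x, dy1 = b1.y - a1.y; // starting vector between fingers
  const dx2 = b2.x - a2.x, dy2 = b2.y - a2.y; // current vector between fingers
  return {
    scale: Math.hypot(dx2, dy2) / Math.hypot(dx1, dy1),
    rotation: Math.atan2(dy2, dx2) - Math.atan2(dy1, dx1), // in radians
  };
}

// Fingers start 100px apart horizontally and end 200px apart vertically:
const t = gestureTransform(
  [{ x: 0, y: 0 }, { x: 100, y: 0 }],
  [{ x: 0, y: 0 }, { x: 0, y: 200 }]
);
// t.scale is 2 (pinch out to double size), t.rotation is a quarter turn.
```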
The app needed to be fast, scalable and reliable, so we even went as far as simulating a whole browser on the server side to take a screenshot of the annotated image, making sure the result looks the same for everyone. For a database we used LinvoDB, a fast embedded store that let us keep the database and the rest of the app in one place, making deployment and development easy.
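The server-side rendering pattern, in rough outline: rebuild the annotated selfie as a small HTML fragment, then have a headless browser load and screenshot it. A hedged sketch of the first step, with illustrative names and styling; the real markup and field names will differ:

```javascript
// Rebuild an annotated selfie as HTML for server-side rendering.
// `annotations` is an assumed shape: text placed at x/y with a rotation
// (degrees) and scale, matching what the client-side editor recorded.
function annotationPage(photoUrl, annotations) {
  const spans = annotations
    .map(
      (a) =>
        `<span style="position:absolute;left:${a.x}px;top:${a.y}px;` +
        `transform:rotate(${a.rotation}deg) scale(${a.scale})">${a.text}</span>`
    )
    .join("");
  return `<div style="position:relative"><img src="${photoUrl}">${spans}</div>`;
}

const html = annotationPage("selfie.png", [
  { x: 10, y: 20, rotation: 45, scale: 2, text: "KIND" },
]);
// A headless browser (e.g. PhantomJS or Puppeteer) would then load this
// markup and capture a PNG of the rendered element.
```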
Doing something at the very edge of web standards meant browser compatibility testing was a big part of the project. We used the BrowserStack platform to test lots of devices, but because the app relies on the camera and file uploads, we also needed to test on plenty of real physical mobile devices.
At the time of release, iPhones don't support direct in-page camera capture (part of the WebRTC family of APIs) without leaving the web browser; Android phones do. On Android we use progressive enhancement: we detect support for this feature and show the person's camera directly in the page, while on iPhone this falls back to a simple upload/take-photo button. When iOS 11, which does have WebRTC support, comes out, the app will automatically enhance for iPhone users too, as the support will be detected. One of the key strengths of the web is its backwards and forwards compatibility with older and newer browsers and devices, and we wanted to stay as true to this as we could.
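The detection itself is simple. A minimal sketch of the check (function names are illustrative, and `navigator` is passed in as a parameter so the logic can run outside a browser):

```javascript
// Decide which capture UI to show, based on feature detection rather
// than browser sniffing. `nav` stands in for the browser's `navigator`.
function cameraMode(nav) {
  const supportsGetUserMedia =
    !!(nav.mediaDevices && nav.mediaDevices.getUserMedia);
  // With getUserMedia available we can show the live camera in the page;
  // otherwise fall back to <input type="file" accept="image/*" capture>.
  return supportsGetUserMedia ? "live-camera" : "file-upload";
}

// A pre-iOS-11 Safari-like navigator lacks mediaDevices.getUserMedia:
console.log(cameraMode({ mediaDevices: {} }));                    // file-upload
// A browser with WebRTC capture support enhances automatically:
console.log(cameraMode({ mediaDevices: { getUserMedia() {} } })); // live-camera
```

Because the check runs every time, no code change is needed when a browser gains the feature: the enhancement simply switches on.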