
One of the serious concerns for any parent in the digital ecosystem is how to keep children safe from the 'digital ghost', and ed-tech could be an easy trap.

Kid safety is of paramount concern. With growing digital exposure come new challenges in the form of potential harms caused by 'digital ghosts' or 'digital bhoots' that exist in the ecosystem. There is no denying that children's digital exposure has increased. It has also gone beyond regular schooling, with parents encouraging children to opt for the many courses and certifications offered digitally. This has resulted in a thriving ed-tech space, and over the past six months we have even seen ed-tech companies acquiring brick-and-mortar education brands.

Ed-tech players offer various flavors to differentiate themselves and attract as many students as possible. Much content is being recreated to suit this mode and facilitate easy self-paced and virtually assisted learning. At the same time, these companies fully understand that they are dealing with tender ages, meaning children's Internet exposure cannot be treated like an adult's. For this, they adhere to 'kid-safe' practices and ensure they use content that does not harm children, especially psychologically.

Ed-tech companies are cognizant of making the experience safe and child-friendly. However, that is all within the boundaries of their application, where they have full control. Outside those boundaries, a lot can be counterproductive for a much larger audience who may never even cross the line into the app to experience the measures taken to make every element inside it child-safe and friendly.

For children under 10, parents primarily explore such solutions themselves. Older children and teens are allowed to explore these opportunities independently and get a chance to recommend to their parents which courses they want to pursue. In the latter case, children are exposed to the discovery of such applications, which means they interface with the promotional and advertising environment of ed-tech apps.

In both situations, the decision-makers need assurance that the application they choose does not harm children in any way, especially through content that could contribute to the development of negative traits, as children do not have the capability to handle such exposure at that tender age. Hate speech, obscenity, and crime are examples of such content.

In the broadcasting world, we have clear demarcations of what is suitable for children and what is not. That is why even action content is not advised for children, and certain content is explicitly categorized as adult-only. Due to brand safety issues, and often without any intervention from the ed-tech solutions provider, ads do get placed wrongly and can carry content unsuitable for children. This means that children and parents exploring such solutions could land in 'bad areas' of the Internet.

While children could land in entirely the wrong territory, one unsuitable for them, parents would be shocked to see the affiliations of the platform they are exploring for their children. The issue worsens because ads are served based on the content consumption patterns and interests of the user the device is profiled against. In these cases, it is a parent, or another adult, whose profile is targeted by advertisers through ad networks, affiliates, and other mediums.

A responsible and aware ed-tech platform has to look at things end-to-end and make the entire experience safe and friendly, not just inside the app once someone is on board. Otherwise, the majority of potential users could form a bad impression of the platform, while only those who convert and sign up appreciate the proactive measures the app takes to give children the environment they deserve.

This 'digital ghost' or 'digital bhoot' is something we need to keep children away from. Otherwise, the psychological damage a ghost is said to invisibly inflict on minds in the real world, damage that can last a lifetime, could be replicated in the virtual world on children at a very tender age.

Unfortunately, ed-tech is a platform with a high risk of carrying this invisible ghost, harming children even while attempting to do better for their overall development. Brand safety is crucial for any brand to address, especially those dealing directly with potentially vulnerable groups like children.

mFilterIt is already engaged in this space, with a few proactive ed-tech platforms piloting activities with us in this direction. However, this should become industry-wide hygiene, with the objective of making the entire experience child-safe, not just inside the app, which is a controlled world for the platform.


Get in touch with our experts for deeper insights. Reach out to learn more!
