This is good stuff. Yes, of course, an algorithm can be written for a biased purpose.
One thing that may help is using open source software on your computer - software where the source code is freely available. This is why many people use Linux distributions rather than Windows. (Maybe not because of the bias issue specifically, but because many people like to be able to see what's in the code.)
I personally use Windows but I like Linux distributions as well.
Now as far as the internet goes, that one is trickier. The code is written in PHP or other server-side languages, and that code is hidden from the user - you only ever see the output it sends back.
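Just to illustrate what I mean, here's a toy example (completely made up, not anyone's real code). The PHP below runs on the server; the visitor's browser only receives the rendered HTML, so whatever rule decides what you get shown is invisible to you:

```php
<?php
// index.php - runs on the server; the visitor never sees this source,
// only the HTML it outputs. Purely a made-up example.
$hour = (int) date('G');  // the server decides, based on a rule you can't inspect
$headline = ($hour < 12) ? 'Morning news' : 'Evening news';
?>
<!DOCTYPE html>
<html>
  <body>
    <!-- the browser only ever receives this rendered HTML -->
    <h1><?php echo htmlspecialchars($headline); ?></h1>
  </body>
</html>
```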
There are certain aspects of the internet where we just can't do much about the bias other than maybe create competition that has a differing bias or no bias - the latter of which seems less likely, since we're all biased. I might have a liberal bias and a script I write might reflect that (I generally don't do politics-related websites, though I do maintain The Climate Registry's website, and that may be semi-political). Another person might create a conservative- or libertarian-biased program.
With that being said, it seems to me that sites like Google, YouTube, Facebook, etc. aren't necessarily biased themselves; rather, they cater ads toward one's own biases, and they use algorithms to infer what those biases are. Facebook determined that I was politically moderate. The fact is I'm liberal, which exposes a problem with their algorithm, but that's beside the point. I have other friends who are conservative. I would get many moderate-to-liberal political ads and the occasional conservative one. My conservative friends would mostly get conservative ads, and my other liberal friends would get more liberal ads.
It goes beyond politics, but politics is probably the easiest example to talk about. If Facebook sees someone likes fast food, it'll start showing more fast food ads to that person.
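My guess is the general idea is something like the sketch below. To be clear, this is purely hypothetical - not Facebook's or anyone's actual algorithm - just a naive version where every signal (a like, a click) bumps a per-topic counter and the highest-scoring topic wins the next ad slot:

```php
<?php
// Hypothetical sketch of interest-based ad targeting. Just my guess at the
// general idea, not any platform's real algorithm.
$interestScores = [
    'fast_food'    => 0,
    'liberal'      => 0,
    'conservative' => 0,
];

// Pretend these are pages the user liked or ads they clicked on.
$signals = ['fast_food', 'fast_food', 'liberal'];

foreach ($signals as $topic) {
    $interestScores[$topic]++;   // every signal bumps that topic's counter
}

arsort($interestScores);         // sort by score, highest first
$topTopic = array_key_first($interestScores);

echo "Serve an ad from the '$topTopic' category\n";   // fast_food
```

Obviously the real systems are far more involved, but the basic feedback loop - your past behavior feeding the next thing you're shown - is the part that reinforces whatever bias you already have.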
Now I just wish YouTube would stop showing me Fiverr ads. I don't like Fiverr, I have issues with them, yet YouTube still thinks I want to see their ads.