prepare for a brain dump
30 Dec
I hope I’m not re-inventing someone else’s wheel…
While coding a new page design, I finally got fed up with calling external JavaScript that uses any form of the document.write() method, since from the client’s perspective the page won’t “finish” loading until the JavaScript has been downloaded and executed. Placing such scripts in the <head> of the page is even worse, as the page won’t even begin to render until everything within the <head> has been retrieved.
The newer methodology designers are using is to embed AJAX (JavaScript + the XMLHttpRequest object) functions that inject additional page content (by updating a tag’s content via .innerHTML or other DOM manipulation), since the page finishes rendering without “lagging” while that code executes. However, this still doesn’t address the browser having to wait for the library itself to be fetched and loaded in the first place.
I’m not certain if anyone else is doing this, but I decided to try a new method based on AJAX to further speed up page loading. The caveat of the method is that a page might finish loading before some JavaScript features have fully rendered, but my feeling is that a faster-loading page is generally much more acceptable to a visitor than a slower one, even if the enhanced functionality doesn’t appear immediately.
Here is what I’m doing:
Let’s say you have a bit of functionality inside an external JS file. You still have to wait for that file to be fetched before the browser can execute it. By embedding a simple AJAX function into the HTML itself, the JS file can be fetched via AJAX after the page has rendered (at essentially the same speed the browser would take to fetch it normally), and multiple JS files can even be fetched in a single call. I then wrap the returned JS inside <script> tags, embed it inside an empty <div> with an id of “ajax”, and loop through it with this code:
// Evaluate each <script> tag injected into the placeholder <div>
// (scripts inserted via innerHTML are not executed automatically):
var div = document.getElementById('ajax');
var scripts = div.getElementsByTagName('script');
for (var i = 0; i < scripts.length; i++) {
    eval(scripts[i].text);
}
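For the curious, here is a minimal sketch of how the whole fetch-and-eval cycle might be wired together. The function name, URL, and wrapping details are placeholders of my own, not necessarily the exact code:

// Minimal sketch: fetch the external JS asynchronously, wrap it in a
// <script> tag inside the placeholder <div>, then eval it.
function loadScriptsViaAjax(url) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', url, true); // asynchronous, so rendering isn't blocked
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            var div = document.getElementById('ajax');
            // The "/" is escaped so an inline copy of this code doesn't
            // accidentally close its own <script> block.
            div.innerHTML = '<script type="text/javascript">' +
                            xhr.responseText + '<\/script>';
            var scripts = div.getElementsByTagName('script');
            for (var i = 0; i < scripts.length; i++) {
                eval(scripts[i].text);
            }
        }
    };
    xhr.send(null);
}

// Kick off the fetch once the page itself has finished loading:
window.onload = function () {
    loadScriptsViaAjax('/js/extras.js'); // hypothetical file
};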
What this effectively does is allow the page to avoid waiting for external JS files to load before being considered finished. The gain may be larger on older browsers, but even on Firefox 3.5 and Internet Explorer 8 I’ve noticed an improvement. It also dodges an issue where Firefox pages lag when the Avast virus scanner is running on the client computer.
The main thing still to test is caching, as this method may be moot if the browser re-fetches the JS file on every page load, but I believe all modern browsers treat an XMLHttpRequest as if it were a normal browser request and will cache the results. If you don’t want caching (for instance, if your JavaScript is generated dynamically on the server, such as by a CGI script), just append a random string as a URL parameter to force a fresh request every time.
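Reusing the hypothetical loadScriptsViaAjax() from the sketch above, that trick is a one-liner; a timestamp works as well as anything random:

// Hypothetical cache-busting call: the throwaway parameter makes each
// request URL unique, so the browser can't serve a cached copy.
loadScriptsViaAjax('/cgi-bin/dynamic.js?nocache=' + new Date().getTime());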
12 Dec
Facebook recently updated their privacy features, which were supposed to enhance member privacy, in particular controlling who might be able to see you in the context of networks and who has access to your friend list.
One public relations issue they had was confusion over the wording of the options, particularly for members who had not previously defined a specific privacy setting for certain things and who were then presented with an option to make their friend list visible to “everyone”. Most people’s default thinking is to interpret “everyone” as limited to their friends when in reality it means, literally, everyone. That confusion alone caused many members to expose their friend list in a way they never intended.
Beyond that point, there are those who believe that even if they understood the settings, their friend list is in fact private once they set it that way, when in reality it’s not as private as they think. I will explain in a moment; once the issue is understood, the hope is that it will educate people to pressure Facebook to be serious about privacy and not simply the perception of it.
For the sake of argument, presume you have a Facebook account and a friend list of 100 people. Presume you’ve set your privacy settings very strictly, limiting visibility of your friend list to your direct friends, or even hiding your friend list from everyone. The perception you are given is that your friend list is now “private”, invisible to outside prying eyes. And to a casual observer, it would be: anyone with access to your account, or even just knowledge that you have a Facebook account, would never see who is on your friend list, even if that account exposed other things you were willing to expose.
Here is where this “privacy” setting fails, and it fails in the key way it’s probably intended NOT to fail: it does not protect your privacy from outside “crawl” companies as well as it does from a casual person perusing your profile. For simplicity’s sake, presume that Google is a third party interested in finding out as much as it can about every Facebook profile it can reach. Now consider that even though YOUR friend list is set to private, the friend lists of your friends may not be. You cannot control their privacy settings, any more than you can control what happens to any of your information once you make it available to them.
For a company like Google, with the time and resources to crawl just about anything made available to the public (and even things not publicly available, depending on their deals with various sites), if they can crawl all your friends’ profiles and see THEIR friend lists, they can see you on those lists and, through inference, build up a profile of your friend list, a list you have supposedly set to “private”. How much of the list gets exposed is limited only by how many of your friends have set the same privacy settings as you. Judging by the default behavior of most people, it’s fair to estimate that, on average, roughly 80% of any given person’s friend list on Facebook would be exposed even if that person’s friend list were set to the most private setting available.
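To make the inference concrete, here is an illustrative sketch (not real crawler code; the names and data are invented) of how a crawler could rebuild a “private” list purely from friends whose own lists are public:

// Illustrative sketch: reconstruct a "private" friend list by checking
// who lists the target on their own (publicly visible) friend list.
function inferFriendList(targetId, crawledProfiles) {
    var inferred = [];
    for (var i = 0; i < crawledProfiles.length; i++) {
        var profile = crawledProfiles[i];
        // If the target appears on someone else's visible friend list,
        // that person is almost certainly on the target's list too.
        if (profile.friends.indexOf(targetId) !== -1) {
            inferred.push(profile.id);
        }
    }
    return inferred;
}

// Invented example: two of three crawled profiles expose their lists.
var crawled = [
    { id: 'alice', friends: ['you', 'bob'] },
    { id: 'bob',   friends: ['alice', 'you'] },
    { id: 'carol', friends: [] } // carol's list is private, so she stays hidden
];
inferFriendList('you', crawled); // -> ['alice', 'bob']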
Facebook can solve this, but it’s unlikely they will, because in order to get value out of their network they are pressured to ensure that even if they offer “privacy” settings, those settings remain effectively easy for outside crawlers to break down. It’s highly likely they’ve even provided such outside parties an exact road map for making these inferences.
How can Facebook solve this? It would be as simple as offering an additional privacy setting that lets members decide whether their appearance on their friends’ friend lists is visible to anyone beyond those friends’ direct friends.
I recently got asked by a friend why I might want to limit people knowing who my friends are. It’s got nothing to do with that. I have no problem with my friends knowing who my other friends are, but I don’t want Google or other companies of questionable trust or intentions to have this information unless I allow them to have it. This would be within my ability to control if Facebook provided a privacy option letting me define how DEEPLY I am seen on others’ friend lists beyond those people’s direct friends.
Keep this in mind the next time you consider whether large corporations like Facebook or Google are really giving you control over your privacy, or whether they’re really pulling the wool over your eyes. They make you believe they’re protecting your privacy when what they’re actually doing is setting up private pacts with each other that let them expose whatever they want about you to each other, maximizing the value of your data for their own self-interest.