What's the policy on previous versions of libraries?
I noticed, for example, that only the latest version (3.1) of the jQuery plugin Nivo Slider is available. The earlier versions (<= 3.0.1) are returning 404s.
Given that the previous version was released in May 2012, that's an insanely fast deprecation policy for a CDN.
I have to guess that either I've stumbled upon a bug or a minor oversight in this case, because I can't see how removing old versions that quickly would be a workable solution at all for 99% of use cases...
Maybe they only support versions that were released after the date a library was added to the CDN. In other words, hopefully once a library+version is added to the CDN, it's supported for a long, long time.
Digging deeper, I see that libraries are added via pull request and that the Nivo Slider library was added 11 days ago. Other libraries that have been there longer (such as jQuery) continue to have old versions.
"This page (http://cdnjs.com/) is currently offline. However, because the site uses CloudFlare's Always Online™ technology you can continue to surf a snapshot of the site."
I always prefer compiling and gzipping the whole site's assets to just one .js and one .css file and serve them through CloudFront. It's usually around 100-200 KB and CloudFront's latency is very low at most places.
This saves a lot of requests and waiting time between page loads (i.e. the first page load is always slower, but subsequent page loads take almost no time because very few requests, or even just one, are needed).
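That build step can be sketched roughly like this (a toy example - the file names and contents are made up, and a real setup would use a proper bundler/minifier rather than plain concatenation):

```python
import gzip

# Hypothetical per-page assets; in practice these would be read from
# disk during the build step, after minification.
assets = {
    "jquery.js": b"window.$ = function (sel) { /* ... */ };\n" * 50,
    "app.js": b"console.log('app booted');\n" * 50,
}

# Concatenate everything into one bundle, then gzip it for serving
# from the CDN as a single request.
bundle = b"".join(assets.values())
compressed = gzip.compress(bundle)

print(len(bundle), len(compressed))
```

Repetitive script text compresses very well, which is part of why the single-file approach lands in the 100-200 KB range.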
Cloudflare pays for this. They've been sponsoring cdnjs for around a year now.
As for the security of our system, all JavaScript files are verified against official sources before going on the CDN. Additionally, we have many library maintainers submitting updates to their own libraries.
Beyond that the only question remaining is our personal integrity. Like any relationship with a third party, you're going to have to decide whether trusting us is an acceptable level of risk. If past performance is any indication of integrity, we have had no security incidents since we began in January 2011.
I assume you are the creator of cdnjs - what is your relationship with CloudFlare beyond the sponsorship? Do you work for them? Do you advise them?
I'm wondering what performance increase CloudFlare actually delivers. I've heard many mixed opinions and I'm at the point where I have to decide whether to use them or someone else.
btw - kudos to Cloudflare for sponsoring this - seems like a great way to put yourself in front of developers.
If I'm guessing what the names of your Pingdom checks mean correctly, they seem to show CloudFlare making your response time 10ms slower. I'm assuming that cdnjs.cloudflare.com is the site with CF in front, and cdnjs.com is without, not sure if that's correct.
Looking at the headers both are being served by CloudFlare. And cdnjs.cloudflare.com is being redirected to cdnjs.com. So, I suspect that that's where the extra time comes from.
You know, I'm just cynical enough to believe that security problems have little or no bearing on stock price. However, reputation is a kind of stock too, so from that perspective I agree: public CDNs like these have little to gain and lots to lose by tampering with these libraries to inject, say, a behind-page pop-up with ads.
What do providers get for access to these type of files? Can they capitalize on the information gathered from file requests? Adding to their knowledge of traffic patterns etc...
cachedcommons claims to have been around since 2009. cdnjs claims to have been around since 2011. I'm not saying one is better than the other, but I am implying that cdnjs is not 'the missing cdn'.
That's pretty awesome! I've always wondered why popular CSS grids weren't also CDN'd. Bootstrap, 960, etc, etc - it makes great sense to have many of these things cached once across hundreds of sites instead of a hundred times. Modify the CDN version in a separate style sheet afterwards.
I suppose if you have a big trust issue letting Cdnjs host your libraries or if you have a customized build of one, you could just do what they did and sign up for CloudFlare and control the files yourself. [edit: CloudFlare, not Cloudfront]
Ok, I'm not a developer and I'm probably missing the big picture or something; my question:
- All this fuss is about hosting a few text files none bigger than several KB??
Who in this world cannot afford to host a few small files nowadays?
Performance is more dominated by latency than file size, and folks usually care about this more for page load time of a page than bandwidth cost. CDNs are usually better than hosting on your server directly, because they have better latency properties (many CDN endpoints which are more likely to be "closer" to the end user).
The advantage of sharing a CDN (as opposed to every site having its own) is caching. If I visit website-a.com and they include jquery.js from cdnjs.com, then I visit website-b.com that also uses the same file, I don't have to download it twice, and website-b.com loads faster than if it served that file itself. That's the big win, IMO, but unfortunately it depends on cdnjs having a high density of use. Even still, if you can shave 100ms off the loading of your page, that can matter.
The idea is that users wouldn't have to download the same jquery or whatever script over and over for every site they visit. It makes less sense when it comes to not-so-popular script files, but I guess there's a convenience factor too.
We need to extend the baseline notion of what the web is. If some nontrivial number of sites are using (say) jQuery, then it would be a good idea to have a way to declare "SCRIPT SRC jQuery version x.y.z" and let the browser figure out where it lives. Then you fetch it once, parse it once, and run it many times, no matter what site you may be visiting.
Or at the very least, we need some way to say "get this script from this URL, but only if it hashes to <this value>, since otherwise it's been compromised". Why worry about CDNs when you can design the script-switcheroo attack right out of the system in the first place?
It seems to me that the best way to handle this would be a content-addressable system with a more traditional fallback. You'd declare that you want file with SHA-256 (or whatever) hash of XYZ. If the browser has it, then you're done. If the browser knows where to find XYZ, then it can go off and grab it however it feels like. For compatibility, you'd also specify one or more traditional URLs where you think that content XYZ can be found, and the browser could use the hash to verify integrity.
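A sketch of that verify-against-a-declared-hash idea (the URLs and the fetch function are made up; the fetcher stands in for a real HTTP request):

```python
import hashlib

def fetch_verified(expected_sha256, fetch_fn, fallback_urls):
    """Try each fallback URL in turn; accept a response only if its
    SHA-256 digest matches the hash the page declared."""
    for url in fallback_urls:
        body = fetch_fn(url)
        if hashlib.sha256(body).hexdigest() == expected_sha256:
            return body
    raise ValueError("no source produced content matching the declared hash")

# Toy fetcher: one mirror has been tampered with, one is honest.
content = b"/* jquery-x.y.z */"
sources = {
    "https://evil.example/jquery.js": b"/* injected ad popup */",
    "https://mirror.example/jquery.js": content,
}
digest = hashlib.sha256(content).hexdigest()
result = fetch_verified(digest, sources.__getitem__, list(sources))
```

The tampered mirror is rejected because its bytes don't hash to the declared value, and the honest one is accepted - which is exactly the property that would take the script-switcheroo attack off the table.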
The trouble with the current CDN setup is that you only get the maximum benefit if everybody uses the same CDN, but people don't necessarily want to trust Google or whoever to host code that their site relies on. With a content-addressable system with fallbacks, you'd get all the benefits of the CDNs with none of the drawbacks.
I think we already have this system - it's the browser's cache.
Consider the following sequence of steps:
1) add the "SIG" values to the tags in the HTML page that have a SRC attribute - be it scripts, images, iframes, or what not. So far, we've just bloated the page a bit with no good effect for the user.
2) update the code in the browser to calculate, for each resource in the browser's cache, the values of a few "frequently used" signatures, and allow signature-based access to the content [compressed trie?], in addition to indexing the cache by URL. Now the "bloat markup" from step 1 starts to kick in - you can reuse all sorts of resources web-wide: scripts, downloadable fonts, artwork, whatever. At this point the user spends time only on the first download of the file, even if that file comes from a very slow VPS.
This approach could dramatically decrease the load on the CDNs for the frequently-repeated content, and get the latency down to near zero, so it would be much better than even the ISP-hosted CDNs.
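A toy sketch of the hash-indexed cache from step 2 (the class and URLs are made up; a real browser cache would obviously be far more involved):

```python
import hashlib

class BrowserCache:
    """Cache indexed both by URL and by content hash."""
    def __init__(self):
        self.by_url = {}
        self.by_hash = {}

    def store(self, url, body):
        self.by_url[url] = body
        self.by_hash[hashlib.sha256(body).hexdigest()] = body

    def lookup(self, url=None, sha256=None):
        # A declared signature lets us hit the cache regardless of
        # which origin URL the bytes were first fetched from.
        if sha256 and sha256 in self.by_hash:
            return self.by_hash[sha256]
        return self.by_url.get(url)

cache = BrowserCache()
jquery = b"/* jquery */"
cache.store("https://site-a.example/jquery.js", jquery)

# site-b serves the same bytes from a different URL; the SIG value in
# the markup lets the cache satisfy the request with no network fetch.
digest = hashlib.sha256(jquery).hexdigest()
hit = cache.lookup(url="https://site-b.example/js/jquery.js", sha256=digest)
```

Without the signature, the second lookup would miss (different URL) and the browser would have to re-download identical bytes.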
Is anyone reading this familiar enough with the FF/Chromium codebase to comment on the feasibility of such an approach?
I thought you had in mind a totally new content delivery mechanism to fetch the data by hash from the network; compared to that, adding content addressability to the browser cache is near trivial. Apologies if I misunderstood.
I was thinking that this could be added as well, but that it would be purely optional, if anyone got around to adding it.
Basically, you need the hash and a fallback regular URL. The browser is free to grab the content using the hash however it likes, whether it's grabbing it from its cache, using the fallback URL, using another known URL for that content, or using some new delivery mechanism.
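That resolution order - own cache, then any other known source, then the page's fallback URL - might look roughly like this (all names here are hypothetical, and the hash check guards every network path):

```python
import hashlib

def resolve(sha256, cache, known_sources, fallback_url, fetch_fn):
    # 1. Local cache, keyed by hash.
    if sha256 in cache:
        return cache[sha256]
    # 2. Any other known source for that hash, then 3. the fallback URL.
    # Either way, reject bytes that don't match the declared hash.
    for url in known_sources.get(sha256, []) + [fallback_url]:
        body = fetch_fn(url)
        if body is not None and hashlib.sha256(body).hexdigest() == sha256:
            cache[sha256] = body
            return body
    return None

# Toy demo: empty cache, no alternative sources, so the fallback is used.
content = b"/* some library */"
digest = hashlib.sha256(content).hexdigest()
cache = {}
body = resolve(digest, cache, {}, "https://site.example/lib.js",
               lambda url: content)
```

After the first resolve, the content sits in the cache under its hash, so any later page declaring the same hash never touches the network.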
Ok. So I didn't miss anything obvious. :-)
This might have been handy back in the dial-up days, but not now. Now I see it as a security and stability risk.
CDNs get the content cached closer to the end user than you are likely to be yourself.
Round trip time is especially important when you're loading several assets in a single page.
CDNs also often provide higher availability than you can provide yourself.
Using someone else's CDN is considerably cheaper than serving resources yourself - e.g. using jQuery from the Google CDN. Bandwidth costs for "small" JavaScript files start to add up once you have 100 million people loading them each day.
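The back-of-the-envelope numbers, assuming ~33 KB for a minified+gzipped jQuery (an assumption for illustration, not a measured figure):

```python
# Rough daily bandwidth for self-hosting one "small" script at scale.
file_size_kb = 33              # assumed minified+gzipped size
loads_per_day = 100_000_000    # the 100M figure from the comment above

gb_per_day = file_size_kb * loads_per_day / 1024 / 1024
print(round(gb_per_day))  # roughly 3,147 GB (~3 TB) per day
```

At typical metered bandwidth prices, ~3 TB every single day for one static file is exactly the kind of bill a free shared CDN makes disappear.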
It's more about client performance through using shared cache assets than cost. Though I always worry about someone at a supposedly secure system (online banking for example) thinking its a bright idea as well.
What about old versions? Taking the jQuery URL and changing it to the version previous to the newest release (1.7.2) 404s on me. This is a showstopper.
Ha, you and I noticed the same thing just moments apart. Hopefully someone will reply on this thread to our question, because I can't imagine having to upgrade all of my projects immediately whenever a new version of any library was released.
EDIT: Looks like the URL pattern just changed slightly for jQuery. 1.7.2 is still around:
Looks really useful at serving the stuff that you can't find anywhere else (and for whatever reason don't want to host yourself), but IMO it really needs a TOS / license - I'm guessing that I can use it on personal sites, but can I use it on commercial sites, can I use it on SaaS sites, or sell sites that use it, etc etc?
Front-end guy here, curious about CDN access on mobile - is it faster to serve one CSS and one JS minified file through a private CDN (like CloudFront), or use something like CDNJS to make concurrent CDN calls with many smaller files spread across a connection?
If you've decided to cripple your browser, that's really not anyone else's fault. I use RequestPolicy with a hand-built whitelist of domains, and sure it breaks some pages, but I don't whine about it because that's my problem.
Users that care, or even notice, are so few that you are noise compared to the number of users that will leave if the site loads slower, so the CDN is likely to be a net win many times over.
When I see that a webpage has 20 or more scripts attached to it from domains other than the website I am visiting, I assume by default that they are tracking scripts from advertisers or Facebook or similar ilk.
If a website needs scripts, I expect the website to serve them from a domain that belongs to it. For most sites I visit there are at least 30, sometimes 50 scripts from various sites and domains trying to track me, slowing down my browsing experience, and sucking up my system's RAM.
Disabling those causes massive speed-ups. Plus it protects my privacy.
Install ScriptNo in Chrome (or similar for Firefox). You will be shocked by the difference. And you will be shocked by the massive script-abuse currently on the net.
And no, I will not wade through that long list for each and every page I visit to whitelist whatever CDN you have decided to put in the same trustworthiness group as doubleclick.net.
If your site breaks with those scripts blocked, I leave.
Install ScriptNo in Chrome (or similar for Firefox). You will be shocked by the difference.
Yes, I would be shocked to be browsing the web circa 1995. Which is why I use a modern browser that supports modern technologies. I also use Ghostery with a whitelist to block tracking. That way, sites work like they should and I can browse this decade's Internet without having to go on forums and let everyone know I'm important and I'm special and every web app should work on Lynx because I said so.
My point was not that all scripts are evil, but if you want me to trust your site, then don't require me to trust 50 domains I do not know for a minuscule 10ms potential performance improvement.
Unfortunately he is in the vast minority. Only when you start using NoScript, or generally paying more attention to your browser's status bar, do you start to realise the scale of the ridiculousness.
If I ever ran a media site, I'd redirect noscript users to this:
"We love that you've come to value the great material we provide, but we make money off of advertizers. If you run noscript, that's great! We totally support noscript options for those who purchase a subscription plan."