Actually, one should also consider using a CDN, and possibly not pre-processing the files that could be hosted from a CDN. Even if you don't want to use a CDN, you can still get some of the benefit with etagging. The reason for this is that most browsers and ISPs will already have the larger frameworks like jQuery cached (e.g. https://developers.google.com/speed/libraries/devguide, or http://cdnjs.com/). By pre-processing the files and creating one monolithic JS file, you are most likely creating a unique file that is guaranteed not to be cached for any unique visitor. This means that every unique visitor has to download the entire monolithic file. If you're using jQuery, Backbone, Underscore, jQuery UI, etc. on a single site, things can add up quickly. Almost everyone has at least some of these cached from one of the larger CDNs these days.
Versioning plays a role here as well. With a single JS file, you'll need to reprocess the entire file every time anything is updated, which means that all users have to download the entire JS file again whenever any change is made. By utilizing etagging and keeping the files separate, the client will only download the files that have changed since the last visit (the same goes for their ISP if it caches, and its ISP, and so on). This is very important on mobile, where the network operators do a ton of work to reduce traffic as much as possible.
As a side note, another common fallacy is that you should always compress your images for mobile. It turns out that this isn't always the case, as many times the mobile ISP already does compression and caching. By doing this yourself, you could actually make the file that's sent to the client larger, or of poorer quality, than what would normally be sent.
> Sure, but on mobile data, the bottleneck is bandwidth. Most client side rendering frameworks are quite heavy.
React is 50kb minified and gzipped. jQuery is half that. And you load those only the first time you visit a site. Sometimes, not even that, thanks to free distribution networks.
If you have a single photo on your page, your client-side GUI library will be dwarfed by it. And we live in an age where many fancy web pages are using friggin' videos as backgrounds for their home page hero banners.
> A lot of apps have very bad data-needed-to-render / rendered-page ratios
90% of everything is crap. What some apps and frameworks do isn't representative of an entire approach to application development.
> I'm not saying CSR does not have a place, and I think mostly server side rendering with a thin layer of client side for interactivity is great way to go.
Sometimes it is, sometimes it isn't. And having a "thin client side" doesn't preclude client-side rendering. Client-side rendering doesn't automatically mean a "fat client side".
This is basically the idea behind Google's Hosted Libraries. If you use them, there's a very good chance it's already in the browser's cache. I'm kind of surprised that Chrome doesn't ship with them built-in, frankly.
>jQuery was just getting way too big!
Use google CDN and browser cache https://developers.google.com/speed/libraries/devguide
>When I stepped back and took a look at apps I was creating, I realized that I only utilized a small subset of the jQuery functionality...
So what? It's a library, not the Tablets of Law, use whatever you need and leave the rest.
>Messy code! The convenience of JQuery can easily lead to poorly written code and bad practices such as very long and obscure selectors.
It's your choice to write messy code.
Try using jQuery off the Google CDN (https://developers.google.com/speed/libraries/devguide) and see if your example works. Also check your javascript/network console in the browser for any obvious errors.
Also, you might find it easier and cheaper - assuming a static web page with just some javascript (jQuery) - to run it off S3, rather than an EC2 instance. Have a look at the docs here, http://docs.aws.amazon.com/AmazonS3/latest/dev/WebsiteHosting.html
Angular is now on 1.2.7. This article is based on 1.0.7, which is quite old by now. Angular is still new and changes frequently, so if anyone is interested I would recommend finding a more modern tutorial if possible.
First flaw I noticed: <script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.0.7/angular.min.js" type="text/javascript"></script>
This should be: <script src="//ajax.googleapis.com/ajax/libs/angularjs/1.2.7/angular.min.js"></script>
The rest of the tutorial seems good at a glance though.
does uMatrix block CDNs where JS is hosted? Like.. a lot of websites use google's hosted jquery (https://developers.google.com/speed/libraries/) just because it's faster to load it that way. Blocking that would be pointless and just make your web experience worse.
Also, in theory you can even use things like the Google CDN to serve your jQuery -- so visitors visiting your site for the very first time can potentially load jQuery straight from cache, as long as they've visited another Google CDN site in the past.
I'm not sure how well this works in practice though. The google cdn urls might be too unique (differing jquery versions, etc.) to afford good cache hit rates. And I can't guarantee performance either (the comparable Yahoo/YUI CDN has failed me a few times...)
In addition to what everyone else said, IIUC jQuery doesn't make it easy for build pipelines to strip out unused code -- in other words, if you include jQuery, you're probably going to get the full weight of jQuery, even if you only use it for one tiny thing.
And people absolutely did use it for one tiny thing. Why wouldn't you? It's "only" 30k, you probably need it for something else because DOM manipulation is painful, and stuff like `$.ajax()` is way nicer than dealing with `XMLHttpRequest` yourself. And before proper JS package managers, instead of having to add a million dependencies by hand, you could just add one script tag and JS was just better.

Now, manipulating the DOM directly is out of fashion for reasons I don't fully understand, but if you're going to do it anyway, `querySelectorAll` exists, and Vanilla JS got WAY better. We have package managers, so it's much easier to pull in just a library to wrap `XMLHttpRequest`, so why pull in 30k of DOM-manipulation stuff if you just wanted `$.ajax()`? (And it's 30k that your build pipeline can't optimize well, so it probably will actually end up on the page.) Even worse, we have `fetch()` in JS now.

At that point, jQuery could've faded off into the sunset as a well-liked but obsolete library... except it's too visible for that. Anywhere you see that `$`, you know somebody has added 30k to your page, and probably solely for some really nice utility function (something like `$.ajax()`) that they're only using because they didn't know about modern JS (or because this code is particularly old). Even just messing around in the console, Chrome gives you its own `$` and `$$`, which are wrappers around `querySelector` and `querySelectorAll`... unless someone overwrote those with jQuery. So I think that's why people ended up actively trying to remove jQuery.
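For what it's worth, the `$.ajax()` use case described above really is just a few lines of standard JS now. A minimal sketch, assuming a modern browser or Node 18+ (the helper name `getJSON` is mine, not a standard API):

```javascript
// Roughly what people reached for $.ajax({ dataType: 'json' }) to do,
// written against the standard fetch() API instead.
async function getJSON(url) {
  const res = await fetch(url);
  if (!res.ok) {
    // fetch() only rejects on network failure, so check HTTP status here
    throw new Error('HTTP ' + res.status);
  }
  return res.json();
}
```

Usage would look like `getJSON('/api/scores').then(data => ...)` (the endpoint is hypothetical), which is about as terse as the jQuery version without the 30k dependency.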
For common public libraries, I'd recommend using a CDN link to a minified production version.
Here are some examples: https://developers.google.com/speed/libraries/
You could also check other CDNs such as Cloudflare.
If you're using npm to pull in specific, less-common libraries, then you should compile and minify them into a single bundled file. There are certainly gulp plugins for this. Look up UglifyJS.
`<tr>` and not `</td>`
Here's a jsfiddle with 1 and 3 fixed https://jsfiddle.net/52fpf3ox/, in your real code you'll have to import jquery as described here -> https://developers.google.com/speed/libraries/#jquery (assuming you don't want to self-bundle it)
Just because he mentioned minification/concatenation, one thing I would add is that web browsers typically cache (save local copies of) files that they have downloaded in the past and use those when they can identify that the files haven't changed. If you use links to libraries from sites like this offered by Google, that can reduce your site load time because the user's browser will have already downloaded the file for some other site or Google itself and use the local copy.
Note that using an external CDN often doesn't help as much as you think it should. Assuming people pin their dependencies to a specific version of, say, jQuery (because it's good to assume that people are sane until proven otherwise), that's, according to Google's CDN page, 57 different versions of jQuery that they might choose. That's assuming that every page wants jQuery from Google, when there are many different libraries hosted by a huge number of different CDNs.
TL;DR -- No OP, you do not need a special type of webhosting.
Dreamweaver is a trap, but that is ok. You will learn.
Developing your site on a dev machine, then uploading it to the server, makes good sense. Don't use insecure FTP; use SFTP. It is based on SSH, and the FileZilla client supports it.
You do not need special hosting for HTML5. This is an issue for web clients not the server. Some older browsers do not support HTML5.
OK, DNS servers 'glue' an IP address of a webserver to a domain name. So you need a DNS server (godaddy) and a webserver (rackspace). Lots of domain name registrars like godaddy also offer DNS servers and light hosting products but they are low performance.
You do not need special hosting for jQuery. Your choice is if you want to upload the jQuery libraries to your server and link to them as local files in your HTML or if you want to link to jQuery files that are hosted on Google's servers. If you use jQuery externally the host usually has a good network and they will update the files as needed. https://developers.google.com/speed/libraries/devguide
An old saw about web hosting is, "Support, Performance, Price -- pick two." Meaning you can have cheap hosting but you will have poor performance or poor support.
This is probably the best option for you, or even better use the Google Library API by including the link on their page as above. This has many benefits. Hope this helps :)
Although it's technically viable, I think it wouldn't be efficient, since Chrome would become bloated with many versions of the Flutter JS engine. At the same time, you as a developer would have no guarantee about which version of Flutter is inside your user's browser, so an outdated browser could break your system.

A better choice for Google would be decoupling the Flutter JS file and hosting the core parts on Google Hosted Libraries. Then you could accelerate the download of the Flutter engine directly from Google's CDN.
Could you share some link? A snippet is usually a small chunk of sample code to show how things work.
EDIT: aaah you are talking about this here I guess. The snippets give you example code on how to embed jquery on your page. So depending on which jquery version you want you pick the right one and add it to your code.
Unless you work for them, you probably don't know how those sites implement user tracking or what kind of information they are collecting. Their privacy policy can be written in a way that legally excludes the static sites, but in language that does not make it obvious. I don't see how trying to reduce my digital fingerprint can be a bad thing. Maybe I'm just being a bit paranoid.

Also, a lot of sites use Google APIs such as the Ajax and jQuery libraries, which they often fetch from Google's static API servers. I would rather tell Google less.
Several domains in the list aren't there (primarily) for tracking, but because they offer common code libraries like jQuery and by linking the library from a distributor like Google or Amazon that many other sites will also link, users will experience much faster load times. That speed up comes for two reasons: First, browser caching means your browser won't download the same file multiple times—it stores a copy and when it sees another site requesting the same file, it just uses the copy it saved. Second, when it does need to grab the file, it's coming from a separate, dedicated server and doesn't rely on the original server for those files. Google and Amazon and other big names have better, faster infrastructure in place than most lesser known sites, so loading as much content as possible from the big name servers is going to be faster and more reliable, and also save the little guy money (many site hosts charge based on the amount of data transferred).
>Use CDN to load the page assets
Doesn't that mean that you should move all third-party scripts and CSS to some public CDN like https://cdnjs.com/ or https://developers.google.com/speed/libraries/ ?

I also suggest moving all pictures to a subdomain, let's say img.productivity.mobi or static.productivity.mobi or even assets.productivity.mobi. Maybe you need to move your custom CSS and scripts too.

Try managing your domain with Cloudflare. They give you free SSL and some basic DDoS protection.
You will find that it is VERY popular to serve jQuery and other common libraries not only from a CDN but from a CDN outside the organization's control, like: https://developers.google.com/speed/libraries/ .

What you gain here is that even the first call to the script on your first load is often already cached and a no-op.
What I dislike about this are:
JavaScript comes bundled with your browser, no need to download :)
You can download jQuery in a variety of ways; the easiest is to use a CDN: https://developers.google.com/speed/libraries/. Just copy and paste the `<script></script>` tag into the bottom of your HTML document, but before the closing `</body>` tag.

Or you could download it from jQuery's site, put it in a folder, and link to it from your own `<script src="link_to_the_jquery_file"></script>`, but using Google's CDN is easier.
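For example, the include ends up looking something like this (the version number here is illustrative — pick the one you need from Google's page):

```html
<!-- Load jQuery from Google's CDN; anything after this tag can use $ -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
<script>
  // Your code that depends on jQuery goes below the include.
  $(function() {
    console.log('jQuery ready');
  });
</script>
```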
Hi. It should probably be noted that this'll only get you updated to about last year in June. I don't plan on keeping my side up to date. Check https://developers.google.com/speed/libraries/devguide for new version paths you might have to add to libraries.txt manually.
Google hosts a version of jQuery you can include in your page: https://developers.google.com/speed/libraries/devguide#jquery
Just include one of the script lines (<script...) there in the header of your html file depending on which version you want to use.
For one thing, jQuery helps solve several browser incompatibility issues. For another, it often makes things easier than pure JavaScript.
If you don't want to use it, then don't. If you do, then I recommend using a CDN to serve up the file, preferably a source that is used widely so that it is likely to already be cached in the user's browser (perhaps Google or jQuery's MaxCDN versions.
> We thought it would be faster if we put it on Google's servers...
Reddit never put the jQuery library on Google's servers. jQuery is offered by Google as an option among their many hosted JavaScript libraries https://developers.google.com/speed/libraries/devguide.
So, these are comments on the site, and not the actual business, since it's what I know as a developer.
The bottom parts move up and down as the testimonial quotes change depending on the length of the text being displayed.
You're also sending and loading ~35 JavaScript files when you open the page, which is why it takes a bit to come up. One thing in the page code is that you're hosting the jQuery.js file yourself, which is the largest of them all. If you use the one from Google, people will probably already have it cached, since many other sites already use it, so they won't have to download it when they hit your page, which saves time. Also, with tons of JavaScript, I'm guessing this site won't perform super well on older computers, given how much time is spent rendering and styling the page.
Now, I'm definitely not a designer or anything, so take the following opinion with a huge grain of salt. I can't help but feel that this sort of site layout is getting highly overused. You have to do lots of scrolling down to get the info that you want, and personally, if I don't see the relevant links in front of me as soon as I hit the main page of a website, I tend to go somewhere else. The bar that slides down after you scroll a bit is useful, but personally, with all of these pages that look and work exactly the same to me, I probably wouldn't even give it the chance to come up. Again, this is personal opinion, but I figured lying wouldn't be useful to you either. Lastly, you should probably list a source for the stats you cite. Whenever I see stats on a website without any source, I am instantly leery of what I'm being offered.
>How do I add said jQuery library?
The same way you added `script.js`. You can download your own copy of jQuery or use a copy hosted by a CDN (e.g. Google).
>And I fixed those errors, thank you.
No. That's still wrong. Do you understand how opening and closing tags are supposed to work?
Think of them as brackets. They wrap around a bunch of other tags.
This would be a valid way to wrap something in brackets...
{[]}
This would be invalid....
{[}]
You cannot close a set of brackets so that the first set `{}` closes before the second set `[]`.

Now think of left brackets as opening tags, e.g. `<body>`, and right brackets as closing tags, e.g. `</body>`.

There are also certain tags that can only exist inside certain other tags. `<head>` tags cannot exist inside of `<body>`. They must exist on the same level as `<body>`, within the `<html>` tags. `<head>` tags must always come before the `<body>` tags.
Did you leave your div named demo? If so, leave it as is. Add a jQuery script tag to your page (after all the HTML): https://developers.google.com/speed/libraries/devguide then take that line of code (the one you pasted in this post) and put it into a script tag in the head of your document.
My guess is that you haven't included a reference to the jQuery library. Copy and paste from the "jQuery" section on Google's hosted libraries.
First I would probably use the google hosted libraries. https://developers.google.com/speed/libraries/devguide
Then I would get an object filled with data. If you use jQuery's `.ajax` function, it has a `converters` property which should convert your XML to JSON. Then you can access your data using dot notation. Also, your XML doesn't look correct, as you have a match tag which should have a closing tag somewhere; I'm assuming you left it out. You technically don't need individual tags for players, maybe just an away-player tag and a home-player tag. But if your school sets up its XML the way you posted, you might have to do a regex of some sort on the tags.
The google box actually caches a hell of a lot more than just youtube. For example, their javascript library CDN, anything served from *.gstatic.com, dl.google.com, etc.
>and I just stumbled upon this minified jquery UI file which has been customized to fit the website.
Kill it and re-download the latest jQuery UI (the complete build, at least until you figure out which parts of jQuery UI you need)? Or use Google's CDN?
Google can serve their hosted scripts over SSL if you want. Their default URL is protocol-relative (leading "//") so it will automatically use SSL if the page it's on is using SSL.
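The resolution behavior is easy to check with the standard `URL` class (available in browsers and Node): a `//` URL inherits whatever scheme the page itself was loaded with.

```javascript
// Protocol-relative URLs take their scheme from the base URL (the page).
const lib = '//ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js';

const fromHttps = new URL(lib, 'https://example.com/page').href;
const fromHttp = new URL(lib, 'http://example.com/page').href;
// fromHttps resolves to https://ajax.googleapis.com/...
// fromHttp resolves to http://ajax.googleapis.com/...
```

This is why the same copied script tag works on both secure and insecure pages without mixed-content warnings.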
So you'll need to include jQuery, which is a JavaScript library used for manipulating the DOM. You can download it and include it in your project, or load it from a third-party CDN like Google Hosted Libraries.
Then you'll want to set up something similar to this.
Not just non-existent, but inaccessible. Services like Google Fonts, Google Hosted Libraries, and ReCaptcha are unavailable in mainland China, making a lot of US-based web sites unusable.
Yes and it's actively being used by malware. They generally don't replace entire sites like in your example. What they do is replace URLs that are commonly used by many websites, such as Google's hosted libraries or URLs where ads are loaded from. If you were to replace for example the jquery script located at google's cdn, then every website that uses it would load your malicious code. And there are many that do. :(
Then you would spam them with ads and be able to steal some info.
We agree.
Every time I see a "Google is evil, I want them out of my life 100%" post, I have to roll my eyes a bit.
> As for Google, it suffices to not be affiliated with it and block its requests.
Blocking all requests to Google is going to be difficult. You can run Pi-hole for your home network, and a curated `/etc/hosts` on a rooted device will go a long way... but neither of those is going to be 100% up to date. At some point, there will be a new domain that's not yet in the latest version of the block lists.
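As a concrete (and deliberately incomplete) illustration, the `/etc/hosts` approach looks like this — these are real Google-hosted domains, but nowhere near a complete list, which is exactly the problem:

```
# /etc/hosts -- null-route a few Google-hosted endpoints (illustrative, not exhaustive)
0.0.0.0 ajax.googleapis.com
0.0.0.0 fonts.googleapis.com
0.0.0.0 www.googletagmanager.com
```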
Additionally, Android uses google's servers for all sorts of things (testing the performance / openness of the wifi/cell network you're on, testing the DNS over TLS server you're using (if on android 9+)).
While I'm sure it's possible to root these out of the source entirely, that's usually not within the technical ability of somebody who's asking "how to make Google not exist".
Even if you do manage to block 100% of network requests to google 100% of the time, the sites that they partner with (read: buy or trade data with) are still going to leak some data about you to google.
Reddit likes to load JS from amazon ad system and googles tag system, but there are loads of other websites that you risk breaking because they use google's hosted JS libraries. Note, google does not develop most of the libraries @ that link, they just host them. I don't think there's a huge risk to privacy in having your IP show up in the request logs (if they even keep 100% of the logs...), but op asked how to remove google entirely... and blocking connections to google's CDN would be part of how you achieve that.
If you mean the google apis link, then yes. It can also improve your web performance for people who already have the files in their browser cache. Read more about it here https://developers.google.com/speed/libraries/
> You are potentially paid per line in a sense if the payload size matters, which it often does. Also CPU cycles on weak mobile devices. jQuery is bad on each of these counts.
I can immediately demonstrate your first claim is false, because jQuery, like other popular libraries is often served from a CDN, like Google's: https://developers.google.com/speed/libraries/#jquery
The payload is therefore effectively 0, because the first site to use it will result in the browser caching it, and modern browsers even cache the parsed opcodes and runtime performance analysis for a library like this.
Compared to a custom solution, like the snippets posted in response to my challenge, the payload would be more than zero. And you know, more than zero is worse than zero. Q.E.D.
As for your second claim, [citation needed]
. jQuery always falls back to native APIs when possible in which case the extra "CPU cycles" amount to basically nothing. And for the situations when it can't fall back, then the native implementation would also be a lot more complicated.
That's a recommendation I don't happen to agree with. If your images are all very large, you might simply be better off with resizing and/or optimizing them on your own server.
As for your CSS and JS, anything that's not custom-written may be available on a CDN already. See Google CDN and CDNJS.com, for instance.
Are you talking about this? Otherwise I'm not sure what you mean. There is no downside to using their hosted libraries, though cdnjs may be a bit faster.
Yeah, 3rd party scripts don't have to be bad generally. For example you can include the most used javascript libraries from google to save bandwidth on your server and improve load times on your website (because the library is most likely in the cache of the user already).
A negative consequence of this practice is however that google can track you better. A good and trustworthy site should host all their javascript on their own servers.
And then there is also problem of advertisement and cross site scripting...
Upload the script to your site assets folder, open and edit the NewForm.aspx page and add a content editor web part. Reference the uploaded script in the web part (server relative so ../../SiteAssets/myFile.js). Also make sure if jQuery is not already being loaded on the page that you either add a reference to the jQuery library from google or save a copy to your SiteAssets and reference it there. Also that script above probably needs to be sitting inside this:
$( document ).ready(function() {
    // YOUR CODE HERE
});
Okay, I may have figured out what it was.
It looks to me like you're hosting your own scripts. Maybe you should use one of Google's hosted libraries, seen here?
Another factor: you say your images are small, but that's not true. "Cover-New-bw.jpg" = 869 KB!! That's almost a meg. For a background? No way.
"LVSG-Thumb.jpg" = 638kb
"Thumbnail.jpg" = 763 kb?
For Thumbnails?
It took me 6 seconds to load your site and I'm using fiber optic internet.
Here's what google inspect says
Try using the time-line feature and you can see where certain bottlenecks are Like here for example
Hopefully this helps!
Or the DNS: many wifi hotspots use 8.8.8.8, Google's public DNS. https://en.wikipedia.org/wiki/Google_Public_DNS
Also everyone seems to use angularjs or jquery from google so you still may get tracked via your browser fingerprint. https://developers.google.com/speed/libraries/
In this example, you didn't add jQuery. (If you click on the gear icon next to JS, then go to Quick Add, it lets you add jQuery.) Maybe you didn't add jQuery in your site as well. /u/glethro's comment made me realize this. When you add jQuery to the pen above, your code works fine. Get a link to jQuery using Google's CDN. You can add it before your scripts at the end of your HTML, before `</body>`, and it should work.
/u/paranoid_twitch gave a great answer to get you up and running immediately.
Here are some more libraries hosted by google (look at the right sidebar) and some information on their cdn.
I won't give you links to specific tutorials but I will give you an idea of some of the things to look at
I have used Codecademy before and thought it was a good start, but I don't think it gives you a thorough understanding.
Stackoverflow is a great place to ask questions and find solutions to problems.
Hope this gives you a start.
Not sure if this makes any sense at all since I'm still pretty new to web dev, but I had the exact same issue last week. Adding jquery from my project folder didn't work, but when I used a google link, everything worked fine.
https://developers.google.com/speed/libraries/devguide#jquery
Try that maybe?
I added exactly this to my html file
<script src="jquery.js"></script>
<script src="imagelightbox.js"></script>
<script>
$( function()
{
$( 'a' ).imageLightbox();
});
</script>
Not sure how to format code on here for posting. I changed "imagelightbox.js" to "js/imagelightbox.js" to match the location the file was put. I guess my answer is: Unless that's what the first line up there is doing (which I thought it was), then no I didn't. How do I do that?
Edit: BTW I won't be surprised if I had to modify some code here and didn't realize it and that's the issue. I was trying to follow the instructions he had on his site and saw nothing about modifying any code to get it to work so I just added it in as-is.
Edit 2: Do you mean this: "https://developers.google.com/speed/libraries/devguide?csw=1#jquery"?
Sorry if these are dumb questions. My practical experience with jquery is pretty minimal. Most of what I learned came from Codecademy.
Aside from consolidating, minifying JS and CSS, try doing a waterfall test and see where there are hangups. If it's a third party script/service, consider contacting them about options for speeding up. Sometimes a script can visually hang the page. You can also try, especially with your own scripts since you'll have more control over them, loading scripts in the footer so the page visually loads faster. For common libraries, such as jquery, consider (if you haven't already) using something like Google's hosted library instead of loading it from your own server.
If you have a large project and anticipate a lot of visitors and requests, you may also consider a CDN.
Yep, that's what I'm thinking of. I think baking in a VM that only really works well for one language, or a very specific kind of language, is a poor solution to the problem of having to ship your own runtime with every application. A much better solution would be to put the runtime for the language in one place and make sure it's cached client side, so that it only needs to be downloaded once for all applications in the same language. Like Google does for JS frameworks with its "Google Hosted Libraries". https://developers.google.com/speed/libraries/
Yes, that was intentional. It's how Google says to link to it:
https://developers.google.com/speed/libraries/devguide#jquery
I think it has something to do with making it accessible whether it is through http or https.
>Can I ask what the real intent of having your own TLD really is? Is it meant as a poor mans webfilter? I would assume the administrative effort and resources to do so is greater than a more conventional webfilter, right?
At the company I work for, it's simply that most software assumes DNS exists and functions appropriately, but we have zero reason to expose our internal networks to the internet at large. The other part is, as you say, filtering. It's lighter for the DNS server to reply with a negative result than for the web proxies to examine URLs. Marginally so, but still. (ETA: I just reread my prior comment and note that I put two ideas adjacent to each other in a way that might be confusing. The primary purpose of having a private TLD is the fact that it's private, and by not using mycompany.com, it's really easy to keep clear what is where. The web filtering is simply easy to do. It's certainly not a comprehensive ban.)
>And this is massively off-topic, but while I'm at it, I figured I'd ask about the google bit... If you are so opposed to their business model, do you continue to use their services? if so, why?
If I go to www.google.com, then I have no problem with them logging that. However, if I go to some random website that is based on jquery and the website has chosen to leverage Google's Hosted Libraries service, it's not me using Google's services and I feel no obligation to provide them with knowledge of the fact that I have visited some.random.website.com. I can provide my own copies of those libraries hosted on a machine on my LAN. It will load quicker, won't traverse whatever portion of the Internet normally exists between me and Google's CDN, and also provides me with a bit of privacy about my browsing habits.
For the record, not that I think anyone cares, I normally use Bing as a search engine, simply because Microsoft is not Google.
Any page on the web that uses Google Hosted Libraries (jquery etc) reports your usage to Google. https://developers.google.com/speed/libraries/devguide
It's an ingenious, and devious, way for them to know who's going where on sites they don't control.