tl;dr:
When you set target="_blank"
but you don't set rel="noopener"
then bad things happen:
The new window/tab gets a reference, window.opener,
pointing back to the original window/tab. Using this reference, the opened page can do nefarious things like redirect the user's original tab to wherever it wants!
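The fix is a one-attribute change. A minimal before/after sketch (example.com is a placeholder; `noreferrer` is often added too, for older browsers that don't support `noopener`):

```html
<!-- Vulnerable: the new tab gets a live window.opener reference -->
<a href="https://example.com" target="_blank">example</a>

<!-- Safe: window.opener in the new tab is null -->
<a href="https://example.com" target="_blank" rel="noopener noreferrer">example</a>
```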
I've worked in SEO since 2005. I work solely in development now, but for 12 years SEO was my career focus.
Whatever you read/do make sure it's up to date
Tactics that worked in SEO 5 years ago can get you penalized and hurt your ranking in the search results.
Use Google's Lighthouse tool to see what Google wants you to improve on the website. Acting on its suggestions isn't always easy, depending on the age of the website's stack.
Regardless of the above, you aren't likely to drastically improve rankings for a client without link building, content creation, and an overall website hierarchy that points Google to your most important pages.
Something seems off with these results. 0KB of JS on the old site? That's just not true. To be clear, I'm not saying you are lying, just that your tool appears to be inaccurate.
To be sure, I ran the same sort of experiment with the Network tab of the Chrome DevTools. Here are the important bits:
| | new reddit.com | old.reddit.com |
|---|---|---|
| Requests | 93 | 143 |
| Total Bytes | 2.5MB | 1.2MB |
| HTML Bytes | 143KB | 33.2KB |
| JS Bytes | 823KB | 614KB |
I did also run it through the Lighthouse tool:
| | new reddit.com | old.reddit.com |
|---|---|---|
| Performance | 57 | 40 |
| Progressive Web App | 45 | 36 |
| Accessibility | 66 | 47 |
| Best Practices | 81 | 69 |
| SEO | 78 | 90 |
So, while the new site sends over more data, it does so in a more performant way. That looks to make sense when looking through a lot of the code. They're using Webpack for bundling, and are making liberal and effective use of code splitting. For a site like Reddit, with a lot of distinct sections of the frontend, that makes perfect sense. A huge chunk of those requests are easily cached and the code splitting means they can stay in cache longer (vs one huge script changing on every deploy).
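At its core, the code-splitting pattern described above is dynamic `import()`, which Webpack compiles into separate chunks. A hypothetical sketch (the route paths and module names are made up, not Reddit's actual code):

```javascript
// Hypothetical route map: Webpack turns each dynamic import() into its own
// chunk, so a route's JS is only fetched the first time a user visits it.
const routes = {
  '/settings': () => import('./pages/settings.js'),
  '/profile': () => import('./pages/profile.js'),
};

// Look up the lazy loader for a path; unknown paths get null.
function loaderFor(path) {
  return Object.prototype.hasOwnProperty.call(routes, path) ? routes[path] : null;
}
```

Because each chunk's filename includes a content hash, a deploy that touches only one section invalidates only that section's chunk; everything else stays cached.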
Another key thing is unless you're just hitting one page and leaving, there is a *huge* reduction in bytes going from page-to-page in the redesign:
| | new reddit.com | old.reddit.com |
|---|---|---|
| Page 1 Bytes | 3.2KB | 40.5KB |
| Back to /r/redesign index | 0.7KB | 56.8KB |
| Page 2 Bytes | 3.5KB | 42.4KB |
| Back to /r/redesign index | 0.5KB | 56.2KB |
| Page 3 Bytes | 3.4KB | 43.7KB |
The modal overlay for the comments pages has a huge impact on data usage. It's an order of magnitude or two less. That's where the benefits of the redesign start to kick in. It certainly has a higher impact at initial load time, but it quickly makes up the gap as you browse around for a bit.
Put it in the <head> section and add the "defer" attribute.
https://developers.google.com/web/tools/lighthouse/audits/blocking-resources
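For example, assuming a script file named `main.js` (hypothetical name):

```html
<head>
  <!-- defer: download in parallel, execute after the document is parsed,
       so the script no longer blocks rendering -->
  <script src="main.js" defer></script>
</head>
```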
WAVE, axe, and Lighthouse are great tools that you can incorporate into your development workflow to see how accessible your site is on the fly! They are all available as Chrome extensions, so they're very easy to use. I'd highly recommend WAVE for testing accessibility.
I realize this isn't the question, but I'd absolutely consider accessibility with that hero. Between the bright image and the white font, it's very challenging to read. Lighthouse is a great tool to get an idea of how accessible your site is, with specific breakdowns. Maybe try a darker font color, or a dark overlay on the image with the opacity turned down.
First off, a Progressive Web App (PWA) is not the same thing as an SPA; the two concepts are independent. A PWA can be an SPA, but it doesn't have to be, and an SPA can be a PWA, but it doesn't have to be.
I assume you already know what an SPA is: a web application that runs on only one page. That's really all there is to it.
A PWA is a web application that uses features such as a manifest file and service workers to make the website act and look more like a native application rather than just a website. The manifest file provides information such as the name of the application and associated icons, and the service worker does things like caching assets and data, intelligently loading resources from cache, managing push notifications, and providing some offline functionality when the device doesn't have a connection to the website. If the PWA is also an SPA, that's great, but it doesn't have to be.
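As a rough illustration, a minimal manifest file might look like this (all names, colors, and paths are placeholders):

```json
{
  "name": "Example App",
  "short_name": "Example",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#ffffff",
  "theme_color": "#317efb",
  "icons": [
    { "src": "/icons/icon-192.png", "sizes": "192x192", "type": "image/png" }
  ]
}
```

The page points to it with `<link rel="manifest" href="/manifest.json">` in the `<head>`.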
That being said, Lighthouse was developed to audit PWAs, but you can use it on any website you'd like, and it will point out some things that could be useful (although you may have to ignore some errors if you aren't working on a PWA).
I would recommend just checking it out. Google has a nice tutorial on getting started.
I believe they are referring to Google Chrome's site-audit devtool, called Lighthouse:
https://developers.google.com/web/tools/lighthouse/
You can get almost the same thing using Google's PageSpeed Insights tool:
The contrast is fine according to Google:
> Background and foreground colors have a sufficient contrast ratio.
https://developers.google.com/web/tools/lighthouse/audits/contrast-ratio
Also, you shouldn't use #000 text on #fff background. https://ux.stackexchange.com/a/23968
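For context, the contrast ratio those audits check is a defined formula (WCAG 2.x). A sketch of it, assuming sRGB colors given as `[r, g, b]` arrays:

```javascript
// Relative luminance of an sRGB color, per the WCAG 2.x definition.
function luminance([r, g, b]) {
  const [rs, gs, bs] = [r, g, b].map((c) => {
    c /= 255;
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * rs + 0.7152 * gs + 0.0722 * bs;
}

// Contrast ratio between foreground and background: (L1 + 0.05) / (L2 + 0.05),
// with L1 the lighter of the two luminances.
function contrastRatio(fg, bg) {
  const [l1, l2] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (l1 + 0.05) / (l2 + 0.05);
}

// Pure black on pure white is the maximum possible ratio, about 21:1.
console.log(contrastRatio([0, 0, 0], [255, 255, 255]));
```

AA requires at least 4.5:1 for normal text, which is why a slightly-off-black on off-white can pass comfortably while being easier on the eyes.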
You can use Google Lighthouse
You can even run it online on your website. You can see what is slowing it down and improve it 😉
They've used Lighthouse, a website auditing tool by Google, which does use a mobile viewport and user agent. That's likely not the best way to compare the two, as they are near identical on mobile.
A good place to start is running an audit with Chrome's Lighthouse in devtools. The homepage wasn't particularly slow for me, but the portfolio page was quite slow.
The lighthouse report suggests some areas that you could improve, I suggest you run it yourself and then look up ways to implement the suggestions.
Put some effort into making your site SEO-ready. People who want your service will search for it; if you provide all the required information and your service matches what people want, you will get clients.
With a low budget, there isn't much more that you can do. You could put out ads on things like Facebook or Google, but then you're going to need to make sure those advertisements are targeted correctly and have the best material they can.
This is more of a marketing issue than a web development one, but since this is a question on web dev then I'll answer that...
Simple list (not catch all) that can help you out (this is a generic list and not specific for your case):
It should be reasonably well known by people who need to know about it.
Tooling such as ESLint will flag errors like this for you. It's enabled by default in create-react-app, at least. https://github.com/yannickcr/eslint-plugin-react/blob/master/docs/rules/jsx-no-target-blank.md
Google Lighthouse will also warn you with its audit: https://developers.google.com/web/tools/lighthouse/audits/noopener
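If you're not on create-react-app, enabling that rule by hand is a small addition to your ESLint config, something like:

```json
{
  "plugins": ["react"],
  "rules": {
    "react/jsx-no-target-blank": "error"
  }
}
```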
Speaking of, I was running an audit in DevTools today... Apparently Google is encouraging sites to annoy people with download prompts. User Can Be Prompted To Install The Web App
They're technically encouraging a homescreen link to your existing web page that acts like an app on Android, but the difference is kind of non-existent, if you ask me.
Run a lighthouse test on it and see what the issues are. It's showing a 77 for performance.
Looks like you got 2.2MB worth of data just on your home page.
Resource Type | Requests | Transfer Size |
---|---|---|
Total | 102 | 2,235.7 KiB |
Image | 12 | 1,152.8 KiB |
Script | 52 | 763.5 KiB |
Stylesheet | 25 | 221.1 KiB |
There's a lot of different tools out there, but personally I prefer Lighthouse and PageSpeed Insights.
They are both Google products, which might seem off-putting for some, but I appreciate the presentation of the results and the relevant explanations. For a backend developer like me, making sense of client-side issues sometimes requires a more detailed explanation and some suggestions on how to approach them. With the detailed blocking times, I can focus on the parts that I'm good at and forward the rest to a front-end resource.
Yeah, print will have a much higher resolution than screen. If your image element is 500x500px on your site, you'd want the image file used for that element to be 500x500px, or as close as possible. Tools like Lighthouse in Chrome can alert you to issues like this. https://developers.google.com/web/tools/lighthouse
You can use Google Chrome's Lighthouse audit tools to do a first pass at automated testing.
It catches things like missing alt text, color contrast issues, performance, SEO, and more.
> When someone searches for a particular product, is it logical to have every single product load onto the page. This doesn't seem to be scalable and I feel like it would be extremely wasteful loading every single product.
I think you should let user needs dictate the design. Talk to users and find out what kind of UI they need.
> I also feel like it would not be a sustainable solution if the website scales to a million users
This sounds a bit like premature optimization. You might just want to focus on delivering the MVP.
From your other comments it seems like you're learning a few technologies at the same time. I've been there and it can be overwhelming. The technologies that you're using are all scalable and professional. They are probably not the limiting factor. You'll figure out how to make your site performant, you've just got to keep at it.
In regards to load performance, Lighthouse (which I write the docs for) can give you lots of concrete tips on how to improve. The Critical Rendering Path is the most helpful conceptual framework I've ever seen regarding how to think about performance at a high-level.
Sorry :) That's Time to Interactive, cf. https://developers.google.com/web/tools/lighthouse/audits/time-to-interactive
In general everything produced by Rails asset pipelines has a content hash in the filename. It's essential if you're caching aggressively since it becomes the only cache-busting mechanism when assets change.
Read https://guides.rubyonrails.org/asset_pipeline.html and https://guides.rubyonrails.org/caching_with_rails.html
It's a metric they have been tracking for a while now; it's directly recommended by them and forms part of their Lighthouse and PageSpeed Insights tests.
https://developers.google.com/web/tools/lighthouse/audits/offscreen-images
Not entirely sure what you mean. If you have the same js file or css file loading twice then just remove one of them.
The ?ver=50 at the end of the URLs is just for cache busting. You want users to cache the file for a long time, but then when you update the file, the browser would keep loading the stale cached version instead of your changes. Changing the ?ver=... on a resource URL in the HTML makes the browser fetch the updated file, without you having to change the filename.
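A minimal sketch of that trick (function name is made up for illustration):

```javascript
// Append a cache-busting version to a resource URL. Bumping the version
// changes the URL, so the browser treats it as a brand-new resource
// instead of serving the stale cached copy.
function withVersion(url, version) {
  const separator = url.includes('?') ? '&' : '?';
  return `${url}${separator}ver=${version}`;
}

console.log(withVersion('/css/style.css', 50)); // → /css/style.css?ver=50
```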
Resource blocking, look up how to optimize it. here is a good start: https://developers.google.com/web/tools/lighthouse/audits/blocking-resources
Don't be too caught up in these scores, the tool used is Lighthouse: https://developers.google.com/web/tools/lighthouse/
From an SEO perspective, it mostly checks for the following: https://i.imgur.com/PB3OGY2.png
Are you sure SSR is what you need? Have you already implemented an app-shell model on the frontend and done a performance audit with something like Google Lighthouse?
If your biggest concern isn't SEO, a static site generator and/or SSR might not be your real technical challenges. Code splitting, gzip, lazy loading, long-term caching, and inlining critical CSS can go a long way towards improving site load times.
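Some of those wins are a single attribute these days. For example, native image lazy loading (filename and dimensions are placeholders):

```html
<!-- The browser defers fetching until the image nears the viewport;
     explicit width/height reserve space and avoid layout shift -->
<img src="photo.jpg" loading="lazy" alt="Product photo" width="600" height="400">
```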
Hello. Do you want to work as a designer or frontend programmer? Very few people can do both completely.
As a piece of frontend work, this is not at all bad. Focus on this.
Some tips:
Hey there :) Someone else actually opened an issue with a similar question here: https://github.com/alexzaworski/senseis-top/issues/1
Totally on board with having contributors. If you have WS experience, you may be able to poke through the server and spot things that weren't set up super optimally (I'd never used websockets prior to this project).
If you wanted to try setting up a service worker that could totally be a collaborative learning effort too— I've never used one either. There are some smaller tasks that are perf/accessibility related too, if you run a lighthouse audit you'll find some pretty low-hanging fruit for early contribs.
You 100% want Server Side Rendering. SEO is extremely fickle, so you should aim for the lowest common denominator, which is SSR.
Content being visible on initial render isn't the only consideration for SEO. Performance is a very big factor. Google hasn't come out and said it yet, but all signs now point to their algorithm taking Lighthouse scores into account, which means metrics like First Meaningful Paint, Time to Interactive, etc. are all crucial, and they will suffer from the delay caused by doing only client-side rendering.
There are other benefits to SSR too:
First rule of optimization: Don't optimize early. That also translates into: "Don't optimize without identifying your bottlenecks first."
Suggestion: you probably want to start with Lighthouse or similar tools and first identify the major bottlenecks. You seem to handwave that away, but I'm not sure you have actually analyzed where your problem is.
Once you find out that it actually is the CPU or memory usage of your JS, then you can dive deeper with JS CPU and memory profilers. But you want to make sure of that first. For example, if you spend 90% of the time on loading and 10% on initialization, then you probably want to start with load-time optimizations before profiling your code.
I would start by reading up on SEO, specifically which changes would give you the highest return on investment. There's a tool built into Chrome called "Lighthouse" that can give you a solid list of things to look for.
You don't need NPM for SEO. PurgeCSS might "help", but not in any significant way compared to ensuring that the site's HTML is semantic, that URLs are mostly human-readable, etc.
I respect your hustle, but your post is rather vague. What does your optimization report include? Why would I pay you $20 when I can use Lighthouse or one of a hundred other tools? Who are you, and what are your credentials?
Run a lighthouse check on your pages!
https://developers.google.com/web/tools/lighthouse#devtools
If you score between 97 and 100 in all 4 categories (Performance, Accessibility, Best Practices, and SEO), you may be on to something, at least performance-wise.
These numbers reflect DIRECTLY in organic traffic and bounce rates. Higher numbers = better optimized pages = more money.
Google loves nice code and fast pages and rewards optimized pages with higher rank.
Love the name! In terms of your overall design, I think you need to put more detail into the typography used. You have different fonts and different sizes throughout the site and sticking to the main ones (for example in your logo or brand) would make your site a lot more cohesive and more put together.
Run a lighthouse report as it can highlight some potential factors in SEO, Mobile/Desktop performance, etc. https://developers.google.com/web/tools/lighthouse
It's a tool that measures a website's performance based on the four metrics you see in the post. Since it's included in Chrome DevTools, it's very easy to use and gives insight into the causes of slow rendering, layout shifts etc. Some more info here: https://developers.google.com/web/tools/lighthouse/
I'd suggest trying the "Lighthouse" tool with Google Chrome to judge your SEO score. You don't need any special authority over the site, so you can check it on both your own site and on other stores to see how it stacks up.
It won't do everything (you might also want to look into Search Console on Google) but it'll tell you something at least.
A good place to start is by seeing if your site passes an automated audit using auditing tools. I'm not sure what your role will be in creating the site, or what your budget is going to be, but both Deque and Google offer free browser plugins for checking pages.
The New York Times' homepage fails AA accessibility on a number of counts, including lacking a non-colour hover- and focus-state indicator for links. If they'd just kept the default behaviour for links (underline on hover/focus), those linked titles would be accessible.
Beyond automated tools, you'll likely need to pay someone with some expertise. The thing about accessibility is that you can't use automation for everything - to give a simple example, automated testing can tell if you've got alternative text for an image, but it can't tell if you've got accurate alternative text.
That being said, automated testing will get you most of the way there, most of the time.
As mentioned above, the budget is too low; you could try posting it on r/slavelabour. Besides, it's not hard to check what's causing the site to load slowly. I use Google's PageSpeed Insights and the browser's built-in Lighthouse to monitor insights about my blog.
Here's a glimpse of what PageSpeed Insights has to say about my blog.
It's in your Chrome browser, under DevTools. It runs a report telling you which parts of your website are performing poorly and how to fix them, but you will likely need a developer to fix it for you.
Find out more here: https://developers.google.com/web/tools/lighthouse/
> Which means that it crawled my "about us", "contact us" (etc..) pages but not the actual dynamic blog posts
Well of course. You can't crawl dynamic content.
> but it's clearly not fully baked
It is; I just don't think you understand how these things work.
> also there's no telling (I actually didn't look it up) if google penalizes SPA that uses CSR
There is, it's called Lighthouse.
> which if you're gonna worry about that stuff you might as well go to SSR and have an easier development experience IMO
Code splitting is super easy to set up. SSR most of the time is not.
I just ran Lighthouse (part of the Chrome browser) on your site. It provides a variety of performance "opportunities" (areas for improvement) and diagnostics, which I would check out for your site, too.
The initial server response time of about 7.3 seconds, the render-blocking resources costing about 6.8 seconds, and the "enormous network payload" of 4,278 KB stand out to me as areas to investigate and/or fix, in particular.
I'm not a web dev expert, though, so unfortunately I can't suggest more. If this isn't your area of expertise, I think you'd need someone who is a WordPress full-stack expert.
There are three categories of SEO, imo:
If you haven't already, I recommend auditing the site with Lighthouse to identify any problems, for example with accessibility or SEO.
For example, you have links pointing to external sites without the rel="noopener" attribute: Links to cross-origin destinations are unsafe.
Hopefully all the content has a reliable source of information, and that source is visible (for example, the article's title, date, and source). If it goes through fact checkers, even better.
To judge whether the content you link to is high quality (in general), ask yourself: 1) Does it have a beneficial purpose, and what was the author's goal?
2) How much expertise does the author have?
3) What authority do they have on the topic?
4) How trustworthy is the source?
Honestly it is not gonna be very sexy how I describe it but it is completely local clients in Eastern Europe.
I had no idea what the value of SEO was at first; I researched it a bit on the side and offered one of my clients to pretty much write a blog for them, to see if I could get any results. Sure enough, just following basic guidelines about keyword research, being consistent, and having custom graphics/videos on the page brought insane results, and from there I just kept reading post after post about it on Reddit and elsewhere on the internet.
First, I would recommend learning to get on-page SEO perfect. You can use a developer tool provided by Google, Lighthouse, to make sure you have all of that right, and then learn what a good backlink is, how to optimize speed, what keywords are, and what should be on every page.
It is complicated to get into in a way, but once you do it, it's pretty simple; consistency is key. Smaller things, like figuring out which businesses need it and which would benefit more from different advertising, or how to find good keywords, you can learn in SEO-related posts on /r/Entrepreneur, /r/webdev, and other related subreddits!
Basically tl;dr do it and research which tools are good for tracking statistics.
Good question. We always bake this in with our build estimates. If a developer estimates a feature will take x time to build, it'll take 1.5x after overhead of writing tests, making it accessible, peer review process, qa, etc.
Perhaps start with this: https://developers.google.com/web/tools/lighthouse
If anyone here works at Banca Transilvania, let them know they're idiots. Why the hell do you disable paste on the password-change page?
https://developers.google.com/web/tools/lighthouse/audits/time-to-interactive
HOWEVER, all this crap is overrated; what matters is the reality in production, not some abstract numbers.
Time it with Google Lighthouse, but more importantly, actually look at the screen and see how quickly the different UI elements appear, and ideally get a focus group to comment on the loading speed.
If you don't have access to production or can't test in the field, it's all smoke and mirrors. What you can do is make it as good as possible in ideal conditions (throttle your browser's network and CPU) and see how it feels.
Silly things like a loading logo or an interesting splash screen with a loading spinner can change people's perception entirely. Even if it isn't really faster, they will think it's faster... job done.
My favorite tools:
Results from Lighthouse audit:
Besides looking at your server, run Google Lighthouse https://developers.google.com/web/tools/lighthouse over your site and see if you can solve stuff on there yourself.
I have never seen a 4 in Performance...
E.g., there are lots of render-blocking scripts, and your font is causing something weird.
There are also several JS errors in the console.
I'd suggest you look for a different way to do the video wheel. For such a small page, content-wise, it's loading way, way too much of everything. I can see you're using a plugin for script/style minification, but are you caching?
IIRC Lighthouse should give you some links on what metrics it's using to judge, and should help resolving the issues.
What that's telling you is that you're forcing users to download full-sized images even though you're not displaying them at full size. I'm guessing it's this:
https://developers.google.com/web/tools/lighthouse/audits/oversized-images
It's also telling you that you're not using modern image formats that are more optimized for use across the web.
https://developers.google.com/web/tools/lighthouse/audits/webp
Run Lighthouse; perhaps it's too many DOM elements, and it doesn't really matter what kind: https://developers.google.com/web/tools/lighthouse/audits/dom-size
Good morning! It looks pretty great imo!
I am a self-taught developer and looking to apply for Jr. positions within the next couple of months - funnily enough I live in Islip 😜
I do have a couple of thoughts for you:
Hmm personally, I wouldn’t give them the time of day, but I don’t doubt there are some hidden gems on there.
You’ve gotta ask yourself “If this theme is top quality, why are they only selling for $50 on here instead of $180+ on the theme store”.
If you’re real set on a theme, run some tests to try and check for anything critically bad with it. Run it through W3C Validator and check for any errors, some errors are fine, or at least unavoidable on Shopify so keep a lookout for things like unclosed elements etc. Also run the theme through Pingdom Speed Test to see if the page load is 4s+. And then finally run it through Google’s Lighthouse which will warn of any critical errors in various areas.
Nuxt documentation, Lighthouse's overview of HTTP/2 (points to some useful links).
> Sometimes I want to just find them on LinkedIn or twitter and ask how they did it
That sounds like an excellent idea.
Other ideas:
Check what open source library your company site depends on, and then convince your manager to let you spend 10-20% of your time maintaining a library.
Use Lighthouse to figure out how to improve site speed. Make sure to instrument the site so that you have before and after metrics so that if there is a noticeable improvement in revenue you can take credit for it.
You probably need a viewport meta tag inside your <head>. Something like this:

    <head>
      ...
      <meta name="viewport" content="width=device-width, initial-scale=1">
      ...
    </head>
You can read more about why here: https://developers.google.com/web/tools/lighthouse/audits/has-viewport-meta-tag
This pricing structure is awesome. Haven't seen a breakdown like this before.
OP - $1800 sounds about right to me. Site looks pretty good. Some considerations might be:
It will likely make a difference but maybe not in the way that you're expecting.
Google has, over time, emphasised speed more and more when it comes to ranking websites for SEO. If you can do anything that improves your load times, I would recommend it, or at least start with some of the worst offenders flagged in https://developers.google.com/web/tools/lighthouse/
One of the things that will significantly improve your website's performance would be to use a CDN like Cloudfront or Cloudflare. This will get the content closer to your customers and will, therefore, increase the speed, which in turn will improve your SEO ranking.
I hope that helps?
It's from Lighthouse in Chrome Devtools:
https://developers.google.com/web/tools/lighthouse/
I also ran it through Wave and SiteImprove. I got a 100 score right now with Lighthouse and I haven't even done the landmarks yet. Maybe I don't have to or maybe it's not that great, but I'm not getting any errors in Wave or SiteImprove Accessibility anymore, either. When I run any other website I can think of through these two extensions, I see error numbers in the double digits.
I highly recommend testing your site with Google's Lighthouse. I did a quick test, and it revealed several areas for improvement.
Visually, my opinion is that you've designed elements that are separately interesting (e.g. slants, neural network, font choices) but aren't cohesive as a design. (I looked at it on a desktop.)
Regarding the content strategy, a portfolio site should demonstrate unique skills, projects, and personality. A cursory glance tells me that everything I see is very generic. My recommendation is to highlight (i.e. "unpack") your most unique skills and goals with enhanced content. Conversely, your LinkedIn text is far more interesting (at least to me!).
I hope all of the above is useful in some way. Cheers!
You can run it in chrome dev tools, and it gives you scores out of 100 for Performance, PWA, Accessibility, Best Practices, and SEO. And then gives you detailed info about every point so you can fix stuff. You can also run the test as desktop or mobile, and with or without throttling. It's a really useful tool.
Thanks for the feedback.
I wrote 'learned' everywhere because I'm unsure about how to show skill on my personal projects other than the fact that I did it, and did it well (in my opinion). Would it be better to just describe what the project is?
How about this for my work experience:
Constructed the primary IT helpdesk service portal from the ground up using both frontend and backend JavaScript in ServiceNow
Created customized dashboards for ITIL incident and change management used by the helpdesk and application development team
Responsible for working with multiple teams to gather requirements and ideas for improvement
Yes, following up on these suggestions will have a significant impact if performance is important to you.
A good way to see the worst-case-scenario performance of your web app/website is to use Google's Lighthouse tool. It emulates a user loading the page you pick on a slow, old Android phone with a weak 3G connection.
Based on the screenshots in your post, one of the most effective things you could do is add JavaScript and CSS bundling to this project.
Without more details on how you're developing and hosting this app, it's hard to give advice any more specific than what these tests are showing you.
If you're looking to see how you compare, run these same tests against ESRI's sample websites.
> Aside from making nasty divs in divs in divs in divs in divs in divs in divs for the layout and the lock-in.
Well this *is* a bit of a problem, but I suppose if you're only using it for small local-business sites that don't get much traffic, it's not going to matter much.
I think most people just cringe from a best practices standpoint moreso than a practical one.
A software developer is paid to solve problems. Just because a non-dev can build a basic website on their own, doesn't mean they won't still need someone to come in and fix it.
Wordpress is the bread n' butter for most of the web devs that I know.
A company can build it Wordpress. But it doesn't mean they can maintain it in Wordpress.
That's where a web dev comes in.
I know a guy who charges $175 an hour to fix your shitty Wordpress plugin configuration. And he's never hurting for work.
Wix and Squarespace can only take you so far. If you want something complicated in Wix, well, your performance is going to be garbage.
(Just today, I was doing a design review on a Wix site for a buddy of mine. It took 12 seconds for the first page to load, and overall the site failed a Google Lighthouse assessment with a score of 10 out of 100.)
When it comes to Squarespace, I've helped people move off it because the platform just couldn't handle their use cases.
Most of the jobs in the industry involve 1) maintenance, 2) migration, 3) refactoring, 4) or troubleshooting.
It's actually quite rare to build a completely "greenfield" app for someone (i.e., completely from scratch).
We're paid to solve problems, and as long as there's people out there with issues, you won't have trouble finding work.