For a REALLY excellent overview of how time on page and average session duration are calculated, I highly recommend this Moz Whiteboard Friday. It answers both your questions better than I could.
https://moz.com/blog/how-do-sessions-work
Edit: If you don't have 8 minutes to watch, you can skip to the final 2 minutes for the time on page calculation.
Very good. Heck, it’s recommended by Jeff Sauer. Doesn’t get any better than that, I suppose. See about halfway through these slides, or google a bit for Jeff’s comments: https://www.slideshare.net/jeffsauermn/google-analytics-for-bloggers-96706111
https://moz.com/community/q/will-301-redirects-same-domain-show-as-referral-traffic-in-analytics
301/302 redirects should preserve the original referral information.
Without more context, it's hard to tell exactly what the web developer broke.
Whenever using Filters to (permanently) modify data in a GA View, you should always keep another View unfiltered so all data can be analyzed without the filters. Having an unfiltered View is a best practice and super helpful in situations like this.
If you aren't seeing pageviews associated with the visits, the data is likely coming into GA through the measurement protocol.
To find a solution which works best for your situation, search for articles which specifically mention "measurement protocol" + "Google analytics" + "spam"
Here's one to start with: https://moz.com/blog/stop-ghost-spam-in-google-analytics-with-one-filter
You're seeing what's called "ghost spam." This is spam that is targeted at you, the analyst, not your site users. This article from Moz gives a solution for filtering out ghost spam, but frankly, unless this spam is noticeably inflating your traffic numbers, I wouldn't bother. There's a real risk of accidentally filtering out legitimate traffic.
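The filter that Moz article describes is essentially an "include only valid hostnames" check: ghost spam is injected via the Measurement Protocol without ever loading your site, so its hits carry fake or empty hostnames. A sketch of the same logic in JavaScript, with a hypothetical hostname regex you'd replace with your own domains:

```javascript
// Hypothetical list of hostnames that legitimately run your tracking code.
const VALID_HOSTNAMES = /^(www\.)?(example\.com|shop\.example\.com)$/;

// Ghost-spam hits never loaded your pages, so their hostname is fake or empty.
function isGhostSpam(hit) {
  return !VALID_HOSTNAMES.test(hit.hostname || '');
}

console.log(isGhostSpam({ hostname: 'www.example.com' }));      // false
console.log(isGhostSpam({ hostname: 'free-seo-traffic.xyz' })); // true
```

In GA itself this becomes a View filter of type "Include > Hostname" with that regex; as noted above, test it on a separate View before trusting it.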
That's typically what's called spam traffic. It can be removed via filters or .htaccess. I only filter data in Google Analytics, since I can't be bothered to do it via .htaccess, but this article might help you: https://moz.com/blog/how-to-stop-spam-bots-from-ruining-your-analytics-referral-data
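For reference, the .htaccess approach mentioned above boils down to checking the Referer header against a blocklist. A sketch of the same logic as a plain function (the two domains are commonly cited referral-spam examples, not an exhaustive list):

```javascript
// Commonly cited referral-spam domains; extend this list from your own logs.
const SPAM_REFERRERS = [/semalt\./i, /buttons-for-website\./i];

// True if the Referer header matches a known spam domain.
function isSpamReferrer(referer) {
  return SPAM_REFERRERS.some((pattern) => pattern.test(referer || ''));
}

console.log(isSpamReferrer('http://semalt.com/'));        // true
console.log(isSpamReferrer('https://www.google.com/'));   // false
```

Note this only catches "real" crawler spam that actually hits your server; ghost spam (which never loads your site) can only be filtered inside GA.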
You can try using Supermetrics (http://supermetrics.com/), which allows you to export data to Excel/Google Sheets and create your own reports. It's free for 30 days; after that, you'll have to pay for the premium service. ;)
I have experienced the same issue, and it happened on hacked websites. Usually a fragment of injected code shows such URLs only if the user agent is Googlebot or similar. This way, a normal user sees your normal website, but a search engine sees different pages served from your hostname and indexes them.
To check whether this applies to your website, you can visit it while changing your browser's user agent (https://winaero.com/blog/change-user-agent-chrome/).
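The same check can be scripted: fetch the page twice, once normally and once as Googlebot, and compare the responses. A hedged sketch (the usage function assumes Node 18+ with the built-in fetch; the user-agent string is Googlebot's published one, and any URL you pass is your own):

```javascript
// Googlebot's published user-agent string (for illustration).
const GOOGLEBOT_UA =
  'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)';

// True if the two responses differ, which suggests cloaked content.
function looksCloaked(normalHtml, botHtml) {
  return normalHtml.trim() !== botHtml.trim();
}

// Usage sketch (assumes Node 18+ global fetch; pass your own URL):
async function checkSite(url) {
  const normal = await fetch(url).then((r) => r.text());
  const asBot = await fetch(url, {
    headers: { 'User-Agent': GOOGLEBOT_UA },
  }).then((r) => r.text());
  return looksCloaked(normal, asBot);
}
```

Keep in mind some legitimate sites also vary output by user agent (mobile variants, A/B tests), so a difference is a signal to investigate, not proof of a hack.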
Disclaimer: Not a dev
Render-blocking resources are to be avoided; they hurt your PageSpeed score.
https://developers.google.com/speed/pagespeed/insights/?url=lettering-daily.com
Go to the bottom and it tells you there are two jQuery scripts slowing down page load time. If you can avoid using them by disabling this option, please do; your website should speed up. You can re-test the page at the link above after disabling them to check how much the score improves.
If you install the Chrome extension Google Tag Assistant and run a recording while loading your page, it shows "analytics.js not found", although the page is sending a pageview and an event at 20 seconds [adjusted bounce rate].
Why is this showing an error when it also shows pv / event data received in GA? At the moment, I don't know. Bit swamped with work...will try and come back to this thread tomorrow night.
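For anyone curious, the "adjusted bounce rate" event mentioned above is usually implemented with a timer: after N seconds on the page, an interactive event is sent so engaged visitors stop counting as bounces. A hedged sketch (the event category/action names are arbitrary examples, and the browser wiring assumes the classic analytics.js ga() snippet):

```javascript
// Build the GA event arguments once the visitor has stayed long enough.
// Returns null below the threshold so nothing is sent for quick bounces.
function adjustedBounceEvent(secondsOnPage, thresholdSeconds) {
  if (secondsOnPage < thresholdSeconds) return null;
  return ['send', 'event', 'engagement', 'time-on-page', thresholdSeconds + 's'];
}

// Browser wiring (assumes the classic analytics.js ga() function exists):
// setTimeout(function () {
//   var hit = adjustedBounceEvent(20, 20);
//   if (hit && typeof ga === 'function') ga.apply(null, hit);
// }, 20000);
```

Because the event is interactive (non-interaction not set), GA treats the session as engaged, which is exactly why the 20-second event changes the bounce rate.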
I've noticed this appearing in my address bar on some pages I visit. I haven't been able to narrow down what extension (if any, assuming it is an extension) is causing it, but for the process of elimination, here are my active extensions:
Hopefully someone else can weigh in who may be having the same issue.