This is about CORS and first-party cookies, aka cross-domain analytics - read how folks do it via segment.com @ https://segment.com/blog/introducing-cross-domain-analytics-unify-customer-profiles-across-your-brands/
To do it, you pick ONE domain for the entire company, set the shared cookie on that domain, and add your other domains to the CORS headers - protecting against cross-site scripting while still allowing the cookie to be sent to what the browser sees as a 3rd-party domain. It requires a DNS tweak and a JavaScript tweak, but it's pretty standard enterprise architecture.
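A rough sketch of what the central-domain side of that can look like (the domains and cookie name here are made up; the exact setup depends on your stack):

```go
package main

import "net/http"

// Hypothetical central domain: analytics.example.com. The brand sites
// (brand-a.com, brand-b.com) call it with credentials, and the response
// sets the shared cookie on the central domain.
var allowedOrigins = map[string]bool{
	"https://www.brand-a.com": true,
	"https://www.brand-b.com": true,
}

func track(w http.ResponseWriter, r *http.Request) {
	if origin := r.Header.Get("Origin"); allowedOrigins[origin] {
		// Echo back only whitelisted origins; a wildcard is not allowed
		// when Access-Control-Allow-Credentials is true.
		w.Header().Set("Access-Control-Allow-Origin", origin)
		w.Header().Set("Access-Control-Allow-Credentials", "true")
		w.Header().Add("Vary", "Origin")
	}
	http.SetCookie(w, &http.Cookie{
		Name:     "company_uid",
		Value:    "u_123", // in practice a generated user ID
		Domain:   "analytics.example.com",
		Path:     "/",
		Secure:   true,
		SameSite: http.SameSiteNoneMode, // required for cross-site requests
	})
	w.WriteHeader(http.StatusNoContent)
}

func main() {
	http.HandleFunc("/t", track)
	http.ListenAndServe(":8080", nil)
}
```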
I build analytics stacks like this - Facebook just rolled out a 1st party cookie to help get around safari 3rd party blocking - read about that @ https://marketingland.com/facebook-to-release-first-party-pixel-for-ads-web-analytics-from-browsers-like-safari-249478
Google definitely has a cross domain cookie to stitch all domains to one central 1st party domain. This helps to track users across domains, ensures you don’t have audience splintering, and helps with a variety of Lifetime Value churn attribution models.
Fun thread :)
Segment's engineering blog is a favorite. The article linked and https://segment.com/blog/spotting-a-million-dollars-in-your-aws-account/ are both great engineering + cost reads.
I use a lot of different things: grafana/kibana, Mixpanel (forgot to add that, editing post) + self-built reports built on Segment-fed events (which go to a Postgres warehouse).
I noticed I forgot a few more.
Those "I have one tool in my toolbox" approaches are always a red-flag, Segment went from a Microservices architecture to a Monolith to fix a lot of their problems. Like agile, I find microservice design to be used as a crutch to try to fix project management and business systemic issues without actually having to address them.
Ultimately, how we design systems has their pros and cons, and they have different applications depending on your needs, if you choose your approach first and don't try to analyze your problem space adequately you're likely just wasting time and money.
If you have someone coming in telling you to use one of these design patterns without asking questions first, show them the door.
Jesus that would drive me up a wall. How many engineers work where you are? Can you rally together and ask for a change?
I used to work in a noisy open office where customer service and marketing managers would take calls and have lengthy discussions during the day. The CS agents would constantly interrupt engineers to talk about features they wanted.
My manager asked for us to be moved into our own room, and when that happened, it was amazing. The engineering team was way more relaxed and our conversations stayed on topic. Productivity went up and it was all a result of being able to have silence without being interrupted.
Segment.io studied the noise levels in their workplace and found that their engineers are more productive in quieter areas (source).
I think my manager actually emailed that ^ specific article to higher ups and the move was enforced because the entire team was on board.
I think you have a few misconceptions. Don't worry, you're not the only one. Most other folks have these same misconceptions. I'll try to help:
Myth #1: "Microservices are useful when your project needs to be highly scalable." This is not true. Microservices can make the problem worse in many cases where you need to communicate with other microservices. This introduces more latency. Scalability is unrelated to microservices architecture.
Myth #2: "Companies that use microservices are cool." Most likely not. It's just that they think they're Netflix when they aren't. It's a cargo cult. If they're doing microservices and don't have at least 200+ developers on that same project with multiple data repositories, then they're just doing it for the helluvit and probably making a mess along the way. I've interviewed at / been recruited by several companies like that, and I run as fast as possible. The most recent one was 3 guys writing dozens of microservices all hitting the same database and running on the same VM because "it's like, microservices dude. It's gnarly fast!". I've talked to name-brand Silicon Valley companies with 1000s of developers in India who think microservices can make their systems scale better, but still want to use a monolithic relational database.
Yes, you can always build your own home project using microservices. Just come up with your own scenario (e.g. a fictional banking system). Microservices are really useful when you want to enable developer agility and have a large number of developers working on one large project, or if you want to update one piece of functionality faster than the other features that other teams are working on. If you're a one-man band, it's going to be hard to see the benefits of microservices.
You COULD use Oauth, but service accounts are definitely the right way to do it.
You ask your users to create a new service account and give it the minimum permissions you need to get your work done.
For example, here is how Segment gets access to write into BigQuery: https://segment.com/docs/connections/storage/catalog/bigquery/
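If you were doing the same thing from Go, a minimal sketch looks something like this (project ID, key file, and dataset name are placeholders; the key file is the one the customer's minimally-scoped service account generates):

```go
package main

import (
	"context"
	"log"

	"cloud.google.com/go/bigquery"
	"google.golang.org/api/option"
)

func main() {
	ctx := context.Background()

	// The JSON key comes from the service account your customer creates,
	// scoped down to just the BigQuery permissions you actually need.
	client, err := bigquery.NewClient(ctx, "my-project",
		option.WithCredentialsFile("customer-service-account.json"))
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Example: create the dataset the integration will write into.
	md := &bigquery.DatasetMetadata{Location: "US"}
	if err := client.Dataset("warehouse_events").Create(ctx, md); err != nil {
		log.Fatal(err)
	}
}
```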
I found these two articles give better info about pointer performance
https://segment.com/blog/allocation-efficiency-in-high-performance-go-services/ https://go101.org/article/value-copy-cost.html
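A tiny sketch of the point both articles make - returning a pointer usually forces a heap allocation, while returning a value often doesn't:

```go
package main

type point struct{ x, y int }

// Returned by value: the copy is cheap and can stay on the caller's stack.
func newPointVal(x, y int) point { return point{x, y} }

// Returned by pointer: p escapes to the heap
// (check with `go build -gcflags=-m`).
func newPointPtr(x, y int) *point {
	p := point{x, y}
	return &p
}

func main() {
	_ = newPointVal(1, 2)
	_ = newPointPtr(1, 2)
}
```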
I completely agree with this point. Tools are irrelevant; that is why I am more focused on the "what" to define for the team. I have some opinions about tools we can and should use, but my primary goal here is to make our Platform, and its dimensions including individual Applications, as transparent and accessible for data discovery by as many users in our company as possible.
To your point, I do like Segment.com because we are an existing customer and I suspect we are not using that tool to its potential
If you don't know it yet, I'd suggest checking out Segment. Integrating a single data warehousing platform is a great way to switch out marketing tools quickly. It provides one source and a few destinations (like Google Analytics) for free up to a certain point. It gets expensive quickly after that, but in my experience it's a great starting point.
Hope this helps
I am happy with the tool, but I am not a power user of it. I only use it to get secrets and transform them into environment variables in my docker containers. For this use case it is great.
I found it from reading the Segment blog.
You basically can not run a software or internet based startup these days without doing those kinds of things to your users. If there is profit to be made from a user using your application or visiting your website then those kinds of things are part of your marketing strategy. It's so popular and there are so many different service providers that there is one company that will broadcast your tracking data to all the various providers and tools you use like Mixpanel, Google Analytics & Adwords, Intercom, Mailchimp, Salesforce, Stripe etc.
You simply send them all your user tracking events and they forward them off to everyone so you can use your data in multiple tools without worrying about integrating with them all separately.
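For example, with Segment's Go library (analytics-go; the write key is a placeholder and the exact import path depends on the version you pull in), your code sends each event once and the fan-out to Mixpanel, GA, Intercom, etc. is just configuration in their dashboard:

```go
package main

import "github.com/segmentio/analytics-go/v3"

func main() {
	client := analytics.New("YOUR_WRITE_KEY")
	defer client.Close()

	// One Track call; every destination enabled in the dashboard receives it.
	client.Enqueue(analytics.Track{
		UserId: "user_123",
		Event:  "Signed Up",
		Properties: analytics.Properties{
			"plan": "pro",
		},
	})
}
```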
KSUID implements K-Sortable Globally Unique IDs for PHP. For background see A Brief History of the UUID blog post from Segment.
Depending on your development team's bandwidth, you could use segment.io to fire dynamic remarketing tags for all the different accounts. https://segment.com/docs/integrations/adwords/
Or you could fire the tag X number of times for every account.
Note that if you use AdWords for reporting, your view-through conversion numbers may be over-counted.
Mainly we've been using Mixpanel along with Segment.
I'd say I both love and hate both services. They both do some really cool things but have been a headache to set up. One thing that particularly irks me about Mixpanel, and really most analytics tools, is they seem to assume you'll set everything up right the first time. It would be really nice if I could, say, change event names once I realize I need more events. Also, Mixpanel lacks the idea of a group a user belongs to, like a company, which means if you want to track company information you'll need to hack in a special company user.
This article provides some more details on this, see the section "Interfaces and You". But I'd recommend reading the entire article. The "Takeaways" section at the end is also a great summary of tricks to avoid allocations.
I found that answering 'we need to keep track of X' or 'we need to be more data driven around bla' is almost impossible as the conversation can take you anywhere.
What I do is very carefully write down the questions you want to see answered. For example:
These questions will push you in the right direction.
Myself, I am using segment.com (analytics.js) to trigger analytics calls that then go to Mixpanel custom reports. In addition we have some Prometheus-based reports with simple metrics like Active Users per Week, etc.
If you write down your questions I will reply/suggest which tools to use to solve it. What is your stack today?
Is the code shared between clients? Is this a multi tenancy problem where you’re trying to deploy the same application for multiple clients but they are slightly different? Or are you deploying similar applications with the same shared code?
This is a software architecture problem, to answer the question I’d need to understand a lot more about the domain.
As a good starting point for you to learn here is an article from segment about what not to do https://segment.com/blog/goodbye-microservices/
Check out segment to send one analytics event and forward it to many others behind the scenes.
Also, there are so many web development jobs that don't require creating high performance websites. Check those out.
Hi u/kkurtzz, I had the same problem but didn't find anything that cut it, so I started building my own.
It works like this:
This combination allows you to do pretty much any analysis on channel attribution you want. I blogged about this @ https://www.lifelog.be/day-10-six-months-later
The code is open-source (you can use/copy/borrow) and I am in no way trying to make money off this.
I don't believe Segment stores event data like tools such as Mixpanel and Amplitude, which store the analytics events you send and provide dashboards to view that data. I think with Segment you'd have to send the events to a destination via something like AWS Firehose and load them into a database, which can then be accessed with any number of reporting tools.
More info about Segment here: https://segment.com/docs/guides/
I have no connection to Segment but I do work in this analytics/data space with similar tools and have evaluated Segment.
Not a pitch, but I thought it might be a useful answer to your question: we use our own product, GoSquared. It is accessible and filterable via our UI but also available via our API in JSON format.
I know Segment has the ability to query your data with SQL:
https://segment.com/blog/sql-traits/
I know the landscape pretty well, so based on what you're looking to do with the data I could help with recommendations.
Let me know if that answers your question at all, or I can be more specific.
Use Segment.com and set up your events in Segment. Data layer can largely be managed by Segment also —> https://segment.com/blog/what-is-a-data-layer/
You don’t want to rely on google products - they aggregate data so you can’t often see user level behaviors - and GTM pixels / tags often don’t fire on Safari for various reasons.
Segment costs a little, but once you set up events, it automatically powers dozens of amazing analytics products (I like Amplitude and FullStory).
It's great while you are small and don't want to figure everything out yourself. If you have a high-volume site with low-value users (like a news website) then it's cost-prohibitive, but it's great for most use cases.
[]*int is a slice of integer pointers. That is, while []int is a slice where each element is an int, []*int is a slice where each element is a *int (an integer pointer).
For the faster part, refer to allocation efficiency
This might be of interest to you: https://segment.com/blog/allocation-efficiency-in-high-performance-go-services/
> The rule of thumb is: pointers point to data allocated on the heap. Ergo, reducing the number of pointers in a program reduces the number of heap allocations.
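A concrete illustration of that rule of thumb for the []int vs []*int case above:

```go
package main

type user struct{ id int }

func main() {
	// Values stored inline in one backing array: a single allocation,
	// and nothing extra for the GC to chase.
	vals := make([]user, 0, 1000)
	for i := 0; i < 1000; i++ {
		vals = append(vals, user{id: i})
	}

	// Each &user{...} is its own heap allocation that the GC has to
	// track, on top of the slice's backing array of pointers.
	ptrs := make([]*user, 0, 1000)
	for i := 0; i < 1000; i++ {
		ptrs = append(ptrs, &user{id: i})
	}

	_, _ = vals, ptrs
}
```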
I think what you are looking for is a Customer Data Platform. The team at Segment recently published a thorough article about this type of software: https://segment.com/blog/customer-data-platform/
Personally, I think you can first start different services that integrate with each other. E.g. Intercom for Helpdesk and automated emails, Chargebee for subscriptions, etc.
For the surveys, I would go with a dedicated solution like Qualaroo, Refiner.io, etc. (Disclaimer: i'm the founder of Refiner) and then push the responses back to your system of record (your database, Segment, Intercom, ...).
#marketing_and_code
Thank you for initiating this. I would LOVE to join! It's something all my stakeholders talk about but few understand the difficulties of implementing it.
I have worked in Marketing Analytics for 3 years at a B2B SaaS company. The job involves all marketing tracking, strategic recommendations, controlled online experiments, and other analytics needs.
I have implemented a simple linear attribution model given the incompleteness of our data. Stakeholders took it pretty well. Challenges based on my experience:
- Getting accurate, time-stamped engagement data across platforms, especially if you want to use the Markov-chain model, which is the holy grail. Segment is one platform I know that can hugely help with this.
- Getting stakeholders' buy-in and getting them to make decisions based on the output. Sometimes the truth is too harsh for some... 'Influenced' is all they want :)
The article this one links to, about Segment's Centrifuge, is more interesting than this article by miles.
This article doesn't tell us much of anything to be honest.
I found a service called segment.com that has a Node.js NPM package - I'm using it now and it seems to work, but it's only free up to the first 1000 tracked users. A good start at least.
Hi,
First time poster, long time lurker.
I found Segment's 'Analytics Academy' very useful. I am a brand consultant and have used their approach on many brand strategy assignments. I think you will find articles like 'Product Market fit' an interesting way to inform your choices for a digital customer touch-point, or as a customer retention strategy. Or I can see how this post on 'multi-touch attribution' might help you think through how to value each of your brand assets from a customer acquisition perspective.
I also rely on key future-focused business insights from McKinsey Insights to inform positioning.
Intercom is a suite and hence expensive, so if you only need the event-based campaigns, go for a best-of-breed solution. In any case, you need some development to integrate your backend with such a service. I guess the best would be to use a connector such as segment.com and then connect that to e.g. customer.io or Mailchimp. Keep in mind that these are not transactional messages though, so if you need immediate emails, check out SendGrid or something like that.
Mode. It's a cloud BI tool that works well with Segment (assuming your Segment data is going into a database somewhere). Mode has a free tier. You'll need the credentials to access the DB, but it's < 1 hour for any backend dev to get you up and running.
Not sure why this was downvoted. I can only assume /u/moonjazzz was talking about copying structs that contain pointers or maybe dynamically-sized slices/maps. Why else would the GC be involved?
The quote below from the GC developers themselves alludes to what the best approach is.
>Users are getting more clever about embracing value-oriented approaches and the number of pointers is being reduced. Arrays and maps hold values and not pointers to structs. Everything is good.
https://blog.golang.org/ismmkeynote
For anyone who doesn't understand the relationship between pointers and the GC, there's a nice write up by Segment on allocations.
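To make that quote concrete, this is the kind of value-oriented change they mean - holding values in the map instead of pointers:

```go
package main

type session struct{ hits int }

func main() {
	// Values held directly in the map: no per-entry heap object
	// for the GC to scan and keep alive.
	byValue := map[string]session{
		"alice": {hits: 1},
	}

	// Pointers held in the map: every entry is a separate allocation
	// the GC must follow.
	byPointer := map[string]*session{
		"alice": {hits: 1},
	}

	_, _ = byValue, byPointer
}
```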
The SQLite database is local to each host and gets updated asynchronously by ctlstore which is backed by AWS’s Aurora.
Segment has an open-source one.
https://segment.com/blog/how-to-build-consent-management-into-your-site-in-less-than-a-week/
Generally would agree, but in this situation from reading the answer in the link you provided it’s unclear what they’re doing with Segment. If you check the Segment Catalog, it lists what Segment can pass data to and it doesn’t look great.
It sounds to me like you are tackling your situation in the wrong way. Instead of just pulling the data together you need to take a step back and understand the needs of the company and different consumers.
You need to first understand the company's core KPIs and make sure those can be tracked and reported on in PowerBI.
As time has gone by, the social media platforms have degraded their APIs to the point where very little can be learnt. Without sufficient historical data you can't look at trends, which are the bread and butter of performance analytics.
My recommendation would be to first understand the key KPIs of the main departments and tackle those. If you must focus on the web analytics side of things then I'd suggest making sure the standard things are being tracked correctly in GA. These include goals for conversion so you can track the performance of your ads and the usage of utm tags.
Depending on the business, you might want to consider Segment.com. This tool will help you build better intelligence on users and visitors and can help you build a multi-touch attribution model. I cover this approach in this article you might find helpful - https://www.projectbi.net/multi-touch-attribution-implementation-guide/.
Good luck.
https://segment.com/blog/scaling-nsq/
Totally different messaging semantics. Sounds like they want something with service bus like principles, SQS would be too simple.
People really underestimate the effect that sound can have on an engineer's productivity. This is an old article, but they ran an experiment on this by leaving iPads that track noise levels around an open office: https://segment.com/blog/how-we-added-10-people-without-hiring-a-soul/
Agreed @klysm. We made this unclear on the earlier site, but the problem of unifying all data alone is really hard (and segment.com is doing a great job at that). We're specifically interested in solving one specific need for startups: seeing all your users' actions, quickly. User data tends to be spread across services, and like many other tools, we pull in data from various sources to build a nice interface for PMs or support agents. We're definitely planning to have an API you can use to grab data from Windsor, simplifying the tools you want to build, or letting you unify data across sources in some sort of Zapier fashion.
Check out https://windsor.io , I'd love any more feedback you've got :)
I recently had a first/last attribution solution implemented that may work for you. It's not cross-domain, but I'm sure it can easily be made to work for that as well.
On first visit, save all the tracking info and UTM parameters, as well as a unique ID, to a cookie that's available to all properties: initial LP / initial referrer (never gets overwritten), all UTMs, page paths (as the visitor browses the site), and then last referrer / last LP for future visits.
This cookie will always be available to send all this data, as well as identification info (once lead/registration forms are filled), to segment.com. Have Segment then send specific data as user properties to any other destinations or tools you want, like Mixpanel, GA, or your CRM.
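A server-side sketch of the "never overwrite the first touch" part (the cookie name, domain, and fields are made up; the same logic works just as well in client-side JS):

```go
package main

import (
	"encoding/json"
	"net/http"
	"net/url"
	"time"
)

// What gets frozen on the first visit: initial landing page, initial
// referrer, and the UTM parameters.
type firstTouch struct {
	LandingPage string            `json:"lp"`
	Referrer    string            `json:"ref"`
	UTM         map[string]string `json:"utm"`
}

func captureFirstTouch(w http.ResponseWriter, r *http.Request) {
	if _, err := r.Cookie("first_touch"); err == nil {
		return // already set on a previous visit: never overwrite
	}
	utm := map[string]string{}
	for _, k := range []string{"utm_source", "utm_medium", "utm_campaign", "utm_term", "utm_content"} {
		if v := r.URL.Query().Get(k); v != "" {
			utm[k] = v
		}
	}
	payload, _ := json.Marshal(firstTouch{
		LandingPage: r.URL.Path,
		Referrer:    r.Referer(),
		UTM:         utm,
	})
	http.SetCookie(w, &http.Cookie{
		Name:    "first_touch",
		Value:   url.QueryEscape(string(payload)),
		Domain:  ".example.com", // shared across all properties on the domain
		Path:    "/",
		Expires: time.Now().AddDate(1, 0, 0),
	})
}

func main() {
	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		captureFirstTouch(w, r)
		w.Write([]byte("ok"))
	})
	http.ListenAndServe(":8080", nil)
}
```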
Yes, I understand your reasoning. In Germany these data privacy concerns sometimes create a kind of lockdown: you can't move and you don't find a solution.
There are solutions like RemoteStorage, but they are less well known, implementation is not easy, or they're not reliable enough for the company. Every business needs a stable income, which can come from ads or payments, but also from analyzing user interaction. Notion is using https://segment.com/ for this, at least that's what the WhatRuns extension tells me.
So you can use RemoteStorage if they provide it, and then there is no pressure on you from the company, because you own your data.
That's how I see it: if there is no better solution, develop your own.
You may want to use something like https://segment.com/ to track where users are in the purchase flow before they bounce. That may help give you some insight into why they leave. The site looks fine to me. I'm not a designer though, just a programmer. :)
This is amazing! My favorite part is
> We must give people more control over how their personal information is collected, shared, and sold — and do it in a way that doesn’t lock in massive competitive advantages for the companies that already have a ton of our data.
GDPR burn!
explanation of burn
The GDPR is european law that strictly regulates how a company can handle user data. It's a manifestation of the idea that each person owns her own data, therefore tech giants must respect that ownership by anonymizing it, requiring legal justification for how the data is used, erasing it when asked, and notifying the public when there's a data breach.
The problem is that all these requirements effectively require data gatherers to lawyer up, which isn't inherently bad, but it means that Google is more able to comply with the law than a college dropout with an idea. The law has inspired companies that specialize in GDPR compliance, but that's just another roadblock between the entrepreneur and the marketplace.
Warren is arguing we're better off focusing our regulatory power on the big offenders. She explicitly states, "companies [. . .] with annual global revenue [. . .] between $90 million and $25 billion [. . .] would be required to meet the same standard of fair, reasonable, and nondiscriminatory dealing with users, but would not be required to structurally separate from any participant on the platform." That's a huge leg up for the little guy, which makes sense. She's essentially saying that you don't have to bother complying with all these super complex laws until you can afford to.
Do you have a high-level understanding of what Segment is and how it works? If so, this is as simple as adding GTM as a destination in Segment. https://segment.com/docs/destinations/google-tag-manager/
On the topic of unique id's: I recently started using this ksuid package instead of a RFC-4122 UUID package: https://github.com/segmentio/ksuid
I was inspired by reading & using the Stripe API: all of their identifiers have a short prefix that describes what they are:
evt_1CiPtv2eZvKYlo2CcUZsDcO6 - an event
ch_1D2WUx2eZvKYlo2CtOha79tm - a charge
Unlike RFC 4122 UUIDs, they are a little easier to copy & paste, which is a small but handy detail.
There's this blog from Segment that has a fascinating history of uuid's, leading up to why they created the ksuid package. https://segment.com/blog/a-brief-history-of-the-uuid/
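Using the Go package is tiny:

```go
package main

import (
	"fmt"

	"github.com/segmentio/ksuid"
)

func main() {
	id := ksuid.New()
	// 27-character, URL-safe, and sortable by creation time,
	// e.g. something like 0ujsswThIGTUYm2K8FjOOfXtY1K
	fmt.Println(id.String())
}
```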
Yeah, I agree. Lots of people are blocking Google analytics. I haven't tried this, but perhaps look into Segment. It can do server side analytics: https://segment.com/blog/the-way-server-side-analytics-should-be/
What do you mean, 'was not meant'? https://segment.com/docs/spec/track/
So forget about simply collecting events - do some AI where you generate heat areas, rage areas, and problem areas (for example, where a user has closed the page).
aws-vault login can do this, so at the command line you'd enter the role you want and aws-vault will open the console UI in your browser, logged into that role.
See how aws-okta (a fork of aws-vault for Okta SSO) does that here: https://assets.contents.io/asset_sJsjOEX2.gif
The post where that gif appears is here: https://segment.com/blog/secure-access-to-100-aws-accounts/
Thanks for the explanation!
Interestingly, we are doing what you say to avoid, haha. We pass eventCategory, eventLabel, and eventAction in our track calls and align them with Segment's eComm Spec V2. The main idea is that they've already mapped all of those events to GA, and it allows us to keep those events consistent across all of the integrations. For example, for "Product Added", we know that the same event is being used in GA, Facebook Ads, and AdWords and that the values/names are the same across all tools.
This is their example, but we follow almost all of it:
    analytics.track('Product Viewed', {
      product_id: '507f1f77bcf86cd799439011',
      sku: 'G-32',
      category: 'Games',
      name: 'Monopoly: 3rd Edition',
      brand: 'Hasbro',
      variant: '200 pieces',
      price: 18.99,
      quantity: 1,
      coupon: 'MAYDEALS',
      currency: 'usd',
      position: 3,
      value: 18.99,
      url: 'https://www.example.com/product/path',
      image_url: 'https://www.example.com/product/path.jpg'
    });
To your point though, when we come up with custom events (things that Segment may not have defined) we need to create and hardcode a new Track call - we still follow Segment's format, but the Event name, e.g. 'User Viewed', would not be mapped to an existing GA enhanced ecommerce event, so in that case, it's just a custom event in GA as well.
As per your article, leveraging GTM to handle GA would definitely simplify that integration and make it much easier to update. Good stuff.
A tracking plan is as you said, like a reference. For example: https://segment.com/academy/intro/how-to-create-a-tracking-plan/
Are you using spreadsheets for planning in your team? I'd love to hear your experience.
Unfortunately there is no such thing. The closest thing I can think of is what analytics tools try to do with a Universal User ID. In short, you give out temporary IDs until you know it is the same user, and you can then alias these IDs. There is a good read on the Segment blog.
I mean that my eyes weren't focused there. I had to search around for a few seconds to find it. The reason was there were too many distractions that peeled my attention away from the intro and call-to-action.
For example, the logo is the first thing that catches the attention, because of the contrast between the white and the image and the fact that users naturally look at the top left first. That's not where you want users looking, considering you have a very small time frame to make an impression and give any value to the user.
That intro text also isn't crisp against the image with the drop shadow.
Segment.io is a great example of a properly funnelled landing page. Within the first second, I know exactly what they're doing, with clear actions for the next step. The heading is quite large to attract attention, secondary information is still there but in the background (like the logo and navigation), and after the user reads the text, you see that focused call-to-action.
So the rule of thumb is to have a single heading with one, or at a maximum two, calls-to-action (with one being primary) around it.
Absolutely crucial that you tag events on every step of your cart process so you can see where dropoff is occurring.
Segment also has a pretty good starter list here: https://segment.com/docs/api/tracking/ecommerce/
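If your backend happens to be in Go, tagging the steps can be as small as this sketch (the write key is a placeholder and the import path depends on the version; the event names follow Segment's e-commerce spec so the funnel lines up across tools):

```go
package main

import "github.com/segmentio/analytics-go/v3"

func main() {
	client := analytics.New("YOUR_WRITE_KEY")
	defer client.Close()

	// Fire one named event per step of the cart/checkout flow so the
	// funnel report shows exactly where dropoff happens.
	for _, event := range []string{
		"Cart Viewed",
		"Checkout Started",
		"Payment Info Entered",
		"Order Completed",
	} {
		client.Enqueue(analytics.Track{
			UserId: "user_123",
			Event:  event,
		})
	}
}
```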
Thankfully, to prevent #2 from really messing with people, folks can use do not disturb. Those who are well aware of the notification nightmare on iOS probably take full advantage of that. For those that don't, well... must not be bad enough? Those who really hate notifications? They'll instantly deny your request. Don't stress too much here.
You may also check out https://segment.com/ to help manage analytics. Everything routes to 1 place and you can flip other services on/off in order to distribute information.
Great work. You shipped something and are learning. Good luck w/ your next at bat. Ping me if you need to bounce a thought off of someone. I'm building www.xmcgraw.com to help folks in your position.