As someone who just moved into the Splunk consulting world, here are some things to know:
Most companies outsource their work to Splunk Professional Services firms. These are the guys you'll want to work for.
Working for a Splunk PS partner requires a Splunk Cert.
Splunk certs are not cheap - best to use someone else's money if you can.
Getting a cert can take three months, during which you won't necessarily be working, so have some runway.
If you go for an FTE job that requires Splunk skills, it will likely also require other Admin skills, including other Analytics tooling, and Linux administration.
Hey brok3nwir3,
I’m a Splunk certified architect who just got the recert at .conf this year. It wasn’t until after the test that I found the “blueprints” that Splunk publishes to tell you what to study.
The master list of all blueprints
The specific Core Certified User PDF
The test format is typical Pearson: some multiple choice, some selecting all that apply, etc. The biggest takeaway I can give on the format is to use the flagging feature. Get all the answers you know, then double back to look at the items you’ve flagged as being unsure about.
Comparing your notes to the PDF, most items, about 75%, would be covered at some level by your prep. Try to know the specific commands they mention, the search UI and flow, and hopefully we’ll see you on the other side with a shiny new Cert!
Let me know if you have any other questions, and best of luck
To oversimplify: Search is designed to contact all instances, rather than locking a specific user to a set of instances, so while you could have multiple cloud instances, your Splunk Enterprise search head will still contact all of them when a user runs a search.
The right way to do multi-tenant is through role-based access, limiting users to specific indexes. You can use index-time transforms to route data into specific indexes for each client.
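A minimal sketch of what that routing plus role restriction can look like; every stanza name, hostname pattern, and index name below is a placeholder you'd adapt to how you identify each client's data:

```ini
# transforms.conf
# Route events whose host matches one client to that client's index.
# (The host metadata value has the form "host::<hostname>".)
[route_clientA]
SOURCE_KEY = MetaData:Host
REGEX = client-a\.example\.com
DEST_KEY = _MetaData:Index
FORMAT = clientA_idx

# props.conf
# Apply the transform at index time for the relevant sourcetype.
[syslog]
TRANSFORMS-client_routing = route_clientA

# authorize.conf
# A role locked down to that client's index only.
[role_clientA_user]
srchIndexesAllowed = clientA_idx
srchIndexesDefault = clientA_idx
```

Users mapped to the clientA role then can't search any other client's index, which is the multi-tenant boundary.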
Keep in mind you'll also need an MSP license to run Splunk as an MSSP, which is different from a normal license (https://www.splunk.com/en_us/partners/become-a-partner/managed-service-provider-program.html)
https://www.splunk.com/en_us/training/courses/splunk-fundamentals-1.html
Start with that free course. Part of the work is setting up your own Splunk server, which will get you started.
You'll need to forward logs to the Splunk system via syslog or install a Splunk Forwarder on any systems you want to gather logs from.
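For the forwarder route, the universal forwarder config boils down to two small files; the hostname, port, and log path here are examples, not requirements:

```ini
# outputs.conf on the monitored host
# Where the forwarder should send its data (9997 is the usual receiving port).
[tcpout:primary_indexers]
server = splunk-idx.example.com:9997

# inputs.conf on the monitored host
# Which files to collect, and how to tag them.
[monitor:///var/log/secure]
index = main
sourcetype = linux_secure
```

Restart the forwarder after dropping these in, and enable receiving on 9997 on the Splunk server side.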
Good luck!
There is a bank of questions that are made internally here at Splunk. Each question has a difficulty level from various sections of the exam.
That said, you'll get some questions you encountered before and other questions from the bank.
I don't think there's a way to serve only questions you previously got right, got wrong, or haven't seen - selection is essentially random, but weighted to keep the difficulty level the same for each section.
Hopefully that makes sense.
For Power User, I can say, study the hell out of the material and practice on a test Splunk system. We provide free 50GB/day Dev/Test Licenses here: https://www.splunk.com/en_us/resources/personalized-dev-test-licenses.html
What? There's barely anything on the admin exam that requires more than a cursory understanding of regex. OP, don't obsess over regex; instead, double-check your understanding of all the concepts listed in the blueprint for Certified Admin. Regex is a magical world that you could spend all your prep time studying and still fail the cert exam.
You don't want UBA. Instead, look at the machine learning toolkit and ITSI.
Depending on your situation, you might get a lot of value from an internship. Understanding Splunk is one thing but also understanding its use cases and demands in cyber security, IoT and other types of machine data is a huge part people often overlook. If you have zero experience in IT or Cyber Security I'd consider pairing your Splunk certification with another networking or security related certification.
Any entry level Splunk or AWS certification will likely get you an intern or jr. architect role in a lot of places. Consider the two lists below:
https://aws.amazon.com/certification/
Becoming a Certified Splunk User/Admin or AWS Cloud Practitioner are great first steps and are fairly cheap to attain. However, the further you go down the Splunk certification path, the more expensive the tests and training get. It's best to find a company that will pay for you to advance your knowledge and cover new certifications; ask about that in the interviews.
If you have other questions just let me know.
Splunk Fundamentals 1, my friend. It’s a video course (plus cert if you’re into that) that covers what Splunk is and teaches you all the basics.
https://www.splunk.com/en_us/training/courses/splunk-fundamentals-1.html
TCPDump or Wireshark may be good, but if you want the Splunk-built solution for packet capture, Splunk Stream is a pretty good option; there's a blog post covering the setup.
Some thoughts as a longtime Splunker:
> Most companies outsource their work to Splunk Professional Services firms
I would say "many", not most.
>These are the guys you'll want to work for.
Very good suggestion based on a few folks I've talked with who ~~couldn't~~ didn't get a job at Splunk proper. [Edit: that sounded bad.]
>Splunk certs are not cheap
But Splunk fundamentals 1 is free and is better than nothing. Also, splunk4good is soon to get the second fundamentals class added to their offering. So, anybody with an .edu email address can go that route. Or military, which we announced at .conf.
Use the MC to check when searches are being run. A lot of the time people schedule everything to run at the top of the hour instead of spreading them out.
Look into search skewing, which can help when a bunch of searches run at once (https://www.splunk.com/en_us/blog/platform/schedule-windows-vs-skewing.html)
Check how many searches are running at once. You should have one CPU core per search; you might have to add more CPU cores.
It looks like all the skipped searches are data model accelerations. Make sure you're restricting your data models to only the indexes with data relevant to that data model.
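On the scheduling point, spreading cron offsets and using schedule_window in savedsearches.conf looks roughly like this; the search names are made up:

```ini
# savedsearches.conf (sketch)
# Run at :07 past the hour instead of :00, and give the scheduler
# up to 10 minutes of leeway to skew the start time.
[Client A hourly report]
cron_schedule = 7 * * * *
schedule_window = 10

# A second search offset to :23, letting Splunk pick the window.
[Client B hourly report]
cron_schedule = 23 * * * *
schedule_window = auto
```

schedule_window tells the scheduler the search doesn't have to start at the exact cron minute, which is exactly what helps when everything stacks up at :00.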
If you have an existing ticketing system you can usually poll the api. I do this with Jira.
Alternatively this is basically what ES does. It has correlations, alerts, incident tracking, and open, in progress, closed datasets. Just gotta pay for it. :)
https://www.splunk.com/en_us/software/enterprise-security.html
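For the Jira-style polling, a minimal sketch looks like the following. The base URL, project key, and token are placeholders, and the 15-minute JQL window is just one way to pick up recently updated issues:

```python
# Hedged sketch: poll the Jira REST search API for recently updated issues.
import json
import urllib.parse
import urllib.request

JIRA_BASE = "https://jira.example.com"       # hypothetical instance URL
JQL = 'project = OPS AND updated >= "-15m"'  # hypothetical project key

def build_search_url(base, jql, max_results=50):
    """Return the Jira REST search endpoint with the JQL query encoded."""
    params = urllib.parse.urlencode({"jql": jql, "maxResults": max_results})
    return f"{base}/rest/api/2/search?{params}"

def extract_issues(payload):
    """Pull (issue key, status name) pairs out of a search response body."""
    return [
        (issue["key"], issue["fields"]["status"]["name"])
        for issue in payload.get("issues", [])
    ]

def poll(token):
    """One polling pass; schedule it however you like (cron, scripted input)."""
    req = urllib.request.Request(
        build_search_url(JIRA_BASE, JQL),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return extract_issues(json.load(resp))
```

The output of each pass can be written to a file or stdout that Splunk ingests as a scripted input.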
Certification tracks are on the left: https://www.splunk.com/en_us/training.html. These certs look wonderful on a resume. I'm only a certified Power User but I get pinged by recruiters left and right for Splunk jobs all over the country.
If you are up for some deep reading, check out the Splunk Validated Architecture, which is a <50 pg. doc that among other things, has a dedicated chapter on data collection techniques. Here’s the intro blog post: https://www.splunk.com/en_us/blog/tips-and-tricks/splunk-validated-architectures.html
Read through the exam blueprint and study the PDFs. The sections in the test correspond to the chapters in the PDF.
https://www.splunk.com/pdfs/training/Splunk-Test-Blueprint-Power-User-v.1.1.pdf
Hello. The sys admin and data admin classes are recommended but are not required. (In case anyone writes in with old info, the requirements were dropped last year. See here).
What is required is the Power User Cert.
As you list, your study material should be the Blueprint, and then look up each topic.
Good luck!
There are lots of repositories out there. What you're actually looking for are KML / KMZ files (which Splunk uses natively for geospatial lookups).
See also this old blog post: https://www.splunk.com/blog/2015/10/01/use-custom-polygons-in-your-choropleth-maps.html
Also this doc: https://docs.splunk.com/Documentation/Splunk/7.2.5/Knowledge/Configuregeospatiallookups
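As a quick illustration: once a geospatial lookup is defined (Splunk ships geo_countries and geo_us_states out of the box), a choropleth search is just a stats plus geom. The source lookup and field names here are hypothetical:

```
| inputlookup sample_sales.csv
| stats sum(revenue) as revenue by country
| geom geo_countries featureIdField=country
```

A custom KMZ file uploaded as a lookup works the same way; you just reference your lookup's name in place of geo_countries.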
When I took Power User, all the answers were in the guide; in my opinion there's not much else you could do. Maybe have the Splunk Search Reference and the Quick Reference Guide downloaded. Good luck.
https://www.splunk.com/pdfs/solution-guides/splunk-quick-reference-guide.pdf
But more than anything else, use the included document. All the answers are in there. Good luck m8.
There are download pages for older releases of both splunk core and forwarders:
https://www.splunk.com/en_us/download/previous-releases.html
https://www.splunk.com/en_us/download/previous-releases/universalforwarder.html
You'll be fine, just study the slide deck and this study guide! I passed with the same content but a lot less hands on work. https://www.splunk.com/pdfs/training/Splunk-Certification-Exams-Study-Guide.pdf
If you want to use a load balancer in front of your indexers, you can do that for HEC (HTTP Event Collector) data collection. If you deploy on AWS, ELB (aka Classic Load Balancers) are preferred over the ALB’s.
Check out the Splunk Validated Architectures for more guidance : https://www.splunk.com/pdfs/technical-briefs/splunk-validated-architectures.pdf
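For the LB-in-front-of-HEC setup, the key detail is that every indexer needs the same HEC token enabled so the load balancer can round-robin freely. A sketch, with a placeholder token value:

```ini
# inputs.conf on each indexer behind the load balancer
# (generate your own token; this value is a placeholder)
[http]
disabled = 0
port = 8088

[http://lb_hec_input]
token = 11111111-2222-3333-4444-555555555555
index = main
sourcetype = _json
```

Clients then send to the load balancer's address on 8088 rather than to any single indexer.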
Cert doesn't handle Splunk University, but we will have a pop up testing center at .conf again this year.
You should be able to find the answers to all your cert/education questions here: https://www.splunk.com/en_us/training.html
Taking Fundamentals I and II gives you Power User. To my knowledge there is no Power User exam itself.
https://www.splunk.com/en_us/training/certification-track/splunk-core-certified-power-user.html
Read through the test blueprint; it will tell you where the questions come from in the PDF. The sections listed in the blueprint correspond to the chapters in the PDF. I've done well in the tests by reading through the PDF and focusing more on the sections with the higher weight and the sections that gave me the most "trouble".
https://www.splunk.com/pdfs/training/Splunk-Test-Blueprint-Power-User-v.1.1.pdf
Splunk Core Certified Power User is a mandatory prerequisite to Splunk Enterprise Certified Admin.
All candidates must complete the Power User exam before proceeding.
Splunk Core Certified User is not a mandatory prerequisite to Splunk Enterprise Certified Admin.
Source: https://www.splunk.com/en_us/training/certification-track/splunk-enterprise-certified-admin.html
> Anyone that uses it or have taken a Splunk Cert can recommend which one is better for maybe a SOC analyst Or Cyber Analyst role?
Your question was specifically from an analyst perspective, and that's not admin-certificate territory.
The below is from Splunk's page. https://www.splunk.com/en_us/training/certification-track/splunk-es-certified-admin/overview.html
> A Splunk Certified Enterprise Security Admin manages a Splunk Enterprise Security environment, including ES event processing and normalization, deployment requirements, technology add-ons, settings, risk analysis settings, threat intelligence and protocol intelligence configuration, and customizations. This certification demonstrates an individual's ability to install, configure, and manage a Splunk Enterprise Security deployment.
Look at all the prereqs Splunk suggests for someone attempting the certification: they're all administration courses, none of which is aimed at a day-to-day user.
Sure, if you want to expand your knowledge of the internal workings of ES and admin-type tasks, knock your socks off, but since you said you'd only go for one, I'd focus on Power User. This is one area where I think Splunk sucks with ES (there are others): there's no user course for ES, which it needs given its complexity; the internal workings can be confusing to an average user.
https://www.splunk.com/en_us/resources/splunk-enterprise-metered-license-enforcement-faq.html
> Starting with version 6.5, Splunk Enterprise will no longer disable search when you exceed your licensed data ingestion quota. This will be standard for any new license purchased on or after September 27, 2016. If you’re an existing customer, you will need to upgrade to Splunk Enterprise 6.5 and request a “no-enforcement” key from your Splunk sales rep or Splunk authorized partner.
Short answer is "yes". If you can configure TMDS to send the data to that location and port, Splunk can be configured to listen and ingest the logs.
Long answer is "Yes, but don't do that because it's a bad idea." If you configure Splunk to listen for incoming logs on port 514, then every time you deploy a new bundle or need to restart Splunk for any reason, you will drop logs.
The right answer is to stand up a syslog server to receive those logs and then forward them into Splunk for indexing and analysis.
If you're starting from scratch, check this out: https://splunkbase.splunk.com/app/4740/
EDIT: Please also review Splunk Validated Architecture here: https://www.splunk.com/pdfs/technical-briefs/splunk-validated-architectures.pdf
Start near the bottom of page 31 to dive into syslog architecture.
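A minimal rsyslog receiver for that pattern might look like the following; the paths and port are just common defaults, and the SC4S app linked above is the more packaged route:

```
# /etc/rsyslog.d/10-remote.conf
# Listen on 514/udp and write one file tree per sending host.
module(load="imudp")
input(type="imudp" port="514")

template(name="PerHostFile" type="string"
         string="/var/log/remote/%HOSTNAME%/syslog.log")

if ($fromhost-ip != "127.0.0.1") then {
    action(type="omfile" dynaFile="PerHostFile")
}
```

A universal forwarder then monitors /var/log/remote/, so Splunk restarts no longer drop anything; rsyslog keeps writing the files the whole time.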
Splunk's been around for at least 15 years, is currently on release 7.3, and has many major companies as customers. That's only the list of customers that are public; there's a much, much larger set of companies that don't announce it, though if you look at Splunk engineer openings at big companies you can get a hint of the bigger scope of market penetration. Definitely not a passing fad.
Two avenues, I recommend doing both!
https://www.splunk.com/page/road_map_vote contact the product team and participate
Submit an enhancement request through the support portal.
Squeaky wheel gets the grease and all.
https://www.splunk.com/en_us/training/courses/splunk-fundamentals-2.html
Download the course overview, then go to the Splunk docs to dig into every item. Go two to three layers deep. Be sure to understand all the different ways to use the commands listed.
If they meet the criteria here, then send me a PM with your email address and I'll loop you in with the program manager.
> We’ve got good news and exciting news. The good news is you are current and will be eligible for a one-year recertification window under the new program. The exciting news is that this new certification is bigger and better than ever. Splunk Enterprise Certified Architect is an all-encompassing Architect certification, meaning it includes Cluster Administration and Troubleshooting Splunk Enterprise as part of its prerequisite coursework. From October 1, 2018 to October 1, 2019, you will be eligible to register for the Splunk Enterprise Certified Architect exam even if you haven't completed these two courses. Be sure to act within this recertification window. Candidates who do not pass the Splunk Enterprise Certified Architect exam by October 1, 2019, will be subject to the full certification path including these courses and all prerequisite exams.
It is a bit of a unicorn! In case you aren’t aware, the HF requirement for many TA’s is due to the presence in that package of Python as an app platform component, and not due in any way to the features of a HF in itself. In fact, HFs should be used very sparingly, as explained in part in this blog post: https://www.splunk.com/blog/2016/12/12/universal-or-heavy-that-is-the-question.html
We used to support a lightweight forwarder package, which didn’t do the heavy parsing and sending of cooked data up to the indexers, and it had Python, but it was deprecated a while ago to reduce the support, test, and release engineering burden. We might still ship it, but I haven’t looked in a while. Either way, don’t use it; the UF is the better choice most of the time.
If you're looking for a nice set of data to work with for query and dashboard practise for security you can use the Security Datasets Project.
(I work for Splunk)
> A Splunk indexer requires 12 vCPU and the AWS instance types are 8 or 16vCPU so you will either be underpowered or overpaying.
This completely depends on volume of data ingestion and search usage. 12 vCPU and 12 GB RAM is reference hardware which can support 1.7 TB/day if no searching or other activity occurs. Also bear in mind that these are VMs, so they're shared resources underneath.
Check out these two docs from Splunk on deploying in AWS: https://www.splunk.com/pdfs/white-papers/splunk-enterprise-on-aws-deployment-guidelines.pdf
Seen this best practices tech brief PDF yet?
Or this just published AWS QuickStart?
How much data do you plan on ingesting each day? What's your setup look like? How many concurrent searches do you expect?
Assuming you're building a small clustered environment, you'll need 3 indexers, a search head, a deployment server, a license master, a cluster master, and optionally a heavy forwarder.
Your indexers should have at least 12 vCPU, and the search head at least 16 vCPU. You should use c4 instance types. For your indexers, use c4.2xlarge or c4.4xlarge; bigger is always better, and if you're ingesting >300GB per day I'd go with the 4xlarge. Use at minimum a c4.4xlarge for your search head. You could also combine the DS, license master, cluster master, and HF on a single c4.xlarge instance.
https://aws.amazon.com/ec2/instance-types/
EBS is the standard storage for c4 instances. Splunk recommends EBS-optimized storage to maximize IOPS; indexers are all about IOPS.
Also remember that reserved instances are cheaper than on-demand instances.
Here is the certification handbook, which will answer all your questions. Short answer: yes, you need the Power User cert first; no courses are required.
https://www.splunk.com/pdfs/training/Splunk-Certification-Candidate-Handbook.pdf
Hi mistermattymo! Thanks for your reply.
I do not have any crash log, no.
This is a single instance deployment, running in a fairly conservative box, an m4.large (2 vCPU with 8 GiB RAM and 450 Mbps of EBS bandwidth) for 5GB daily ingestion.
Your OOM comment made me recheck the messages log, and I believe you hit the nail on the head. I found these lines:
lowmem_reserve[]: 0 0 0 0
kernel: 1421 total pagecache pages
kernel: 0 pages in swap cache
kernel: Swap cache stats: add 0, delete 0, find 0/0
kernel: Free swap = 0kB
kernel: Total swap = 0kB
kernel: 0 pages HighMem/MovableOnly
kernel: Out of memory: Kill process 3317 (splunkd) score 359 or sacrifice child
kernel: oom_reaper: reaped process 3317 (splunkd), now anon-rss:0kB, file-rss:0kB, shmem-rss:0kB
plus, I had a user tell me he was running some intense searches when he noticed it went down.
I guess my instance is just not beefy enough for the traffic? Too bad that the kernel decides to kill the process, that's not good.
I'll try to search for specifics on choosing when/how to upgrade the server specs, and maybe start considering distributed indexing. If you have any useful resources on that, send them my way!
I'll definitely join the Slack, could be a cool place to learn how others are dealing with issues.
thanks man, you helped a lot
I should have started by asking why you're after this information, or what you plan to do with it.
The account team and the appropriate stakeholders on your side would be in a position to provide specific information on the quote.
Splunk does publish list pricing on the website. (https://www.splunk.com/en_us/view/pricing/SP-CAAADFV)
The Splunk Core Certified User certification isn't a prerequisite for the Power User exam currently: https://www.splunk.com/en_us/training/certification-track/splunk-core-certified-power-user/overview.html
Generally I’d recommend following the ES End User learning path, found here: https://www.splunk.com/en_us/training/free-courses/overview.html
Of the courses you have available the Power User cert matches the best, since it aligns with the Fundamentals 1/2/3 courses earlier in that learning path.
I see you’ve mentioned a preference toward the ES admin exam, but be aware that it’s fairly tightly focussed on managing ES, rather than actually using the tool. I wouldn’t really bother if you’re looking for an analyst role.
You won't be able to skip to the exam, as one of the pre-reqs is that you have taken the courses: https://www.splunk.com/en_us/training/certification-track/splunk-enterprise-certified-admin.html
Sorry to be the bearer of bad news!
You should be able to find everything you need here :) https://www.splunk.com/pdfs/training/Splunk-Certification-Candidate-Handbook.pdf
you mean the one posted to their website? https://www.splunk.com/en_us/training/certification-track/splunk-core-certified-advanced-power-user.html
I would at least take the free courses:
https://www.splunk.com/en_us/training/free-courses/overview.html
Someone also posted a link to videos a few days ago:
https://www.reddit.com/r/Splunk/comments/jvuud7/30videos_10_hrs_course_content_a_complete/
That being said, I would treat Splunk like any other enterprise software you may have in your org. The same way people pay for network training or DB certifications, I would ask my employer to pay for Splunk training.
You're primarily learning SPL, so 500MB of indexing a day, or importing "free" data (like BOTS), should be enough.
https://splunkbase.splunk.com/app/3353/
https://www.splunk.com/en_us/form/discover-the-power-of-spl.html
Go to the site here:
Splunk Core Certified Power User Exam
Get the Exam Study Guide. From there, download the Test Blueprint. That will have every competency covered in the exam.
If you're not already running Splunk Enterprise at home, you should be. Single instance install is super easy and will allow you to dig into everything you will need to know for User and Power User.
Good luck!
Not sure how it's possible that your colleagues got the Certified Core Consultant cert without completing the prerequisite courses and exams. Are you sure they didn't have the old consultant cert and just took the exam to update it to the new cert?
You can't even attend the Core Implementation course if you haven't completed the prerequisite courses.
https://www.splunk.com/en_us/training/certification-track/splunk-core-certified-consultant.html
I believe you might be looking for something similar to what’s under "Capture values from multivalue fields" in the Splunk docs. Unfortunately I could not link to that sub-header directly.
https://docs.splunk.com/Documentation/Splunk/8.0.6/Viz/DrilldownLinkToURL
I'm assuming the field that contains the URLs is multivalue, meaning that when you hover over one of the URLs, only that one highlights and not both.
Edit: If you have an active entitlement for OnDemand Services and have available credits, you can submit a support case using the service catalog item for building a dashboard. https://www.splunk.com/pdfs/legal/splunk-on-demand-services-catalog.pdf
Works well and is easy to set up for storing your own data, and you're only paying for S3.
There's also https://www.splunk.com/en_us/blog/cloud/dynamic-data-data-retention-options-in-splunk-cloud.html which, from what I've heard, is similar to the above but Splunk-managed. I haven't run into anyone using it; I've heard it's a bit pricey.
Splunk is a structured data collector/aggregator - the end goal of which is to be able to search all of that data for "useful" things
Splunk certifications have value (and I'm not just saying that as someone who's been through a lot of the classes): but only if you're going to use them
Being a certified Splunk Architect, for example, means bupkis if you're not working with data (log files, http streams, netflow, IoT, etc)
You are required to take the classes in order - see the training handbook: https://www.splunk.com/pdfs/training/Splunk-Certification-Candidate-Handbook.pdf
For reference, this is the doc OP is referring to: https://www.splunk.com/pdfs/white-papers/splunk-enterprise-on-aws-deployment-guidelines.pdf
Are you looking to learn it or get certified?
For learning Splunk the 3rd party training "might" be ok.
For certifications Splunk has prerequisites for attending their official training.
https://www.splunk.com/en_us/training/program-guide.html
I have been working with Splunk in an advanced role for many years. Do I need to complete the prerequisite coursework?
Yes. For specific questions or 1:1 guidance regarding your particular path to certification, please contact us directly at .
Ok, we have 6 public certs and a couple of accreditations for partners (as you know) that I would also want on there. Looks like this page (https://www.splunk.com/en_us/training.html#certificationtrack) has some nice icons...
This is looking like a bit of work to set up so it flows well, so it won't happen today. But it's on my todo list. Meanwhile, I can certainly fez you up. :)
What Splunk Certs do you have already?
For what you're doing today, you should have at least Power User. For what you want to do later, Splunk Admin would be required... or at least get a higher paying jerb.
Splunk Architect cert will net you the most for pay: https://www.splunk.com/en_us/training/certification-track/splunk-enterprise-certified-architect.html
Personally, I had Power User before I moved from Federal Contractor to Splunk officially. My LinkedIn was full of job offers for $110k to $140k for having a Power User cert in my area, DC Metro. After getting Splunk Architect at Splunk, I consistently get offers for $180k to $190k. Again, DC Metro. We have close to the highest average salary on the East Coast, and naturally the cost of living to match.
Have some questions for your interviewer. Think about what you want to get out of the role.
In addition to the free download, if you have the time you could do the Free Splunk Fundamentals 1.
https://www.splunk.com/pdfs/solution-guides/splunk-quick-reference-guide.pdf
I like to give this to newbies to Splunk :-)
Hey!
You are right - the certification process has changed drastically, but I do think it's for the better, even if there have been some teething problems.
Honestly, for Architect you will just need to lean on real-world experience, so my best suggestion is to get some VMs or dev tin etc. and attempt to build a distributed Splunk instance; rinse and repeat until you can do it without any additional assistance from the docs.
The Architect cert is quite a step up from Admin so just be prepared for some hands on testing under timed conditions, closed book.
If you have a look at the Splunk Architect Exam Blueprint: https://www.splunk.com/pdfs/training/Splunk-Test-Blueprint-Architect-v.1.1.pdf it should help you out with what areas to focus on for revision, or what areas it will cover.
Hope this helps & good luck!
>How should I prepare for power use exam?
The weekend before, take a day and review all the topics of the Fundamentals I and II courses (which the Power User exam covers). Work with/build these things in your test environment.
For example, one of the topics of Fundamentals 2 is macros. Do you know how to build one? Use one? Pass parameters to it? Go into your test instance, build one out, use it in a search, make sure you understand it.
Review your notes on how to do these things or Google around if you get stuck. The important thing is to get some hands-on practice with these things.
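For reference, a parameterized macro is just a macros.conf stanza; the macro, index, and field names below are invented for the example:

```ini
# macros.conf
# The (1) suffix in the stanza name means the macro takes one argument.
[errors_for(1)]
args = app_name
definition = index=app_logs app="$app_name$" log_level=ERROR
```

In a search you'd invoke it as `errors_for("billing")` (wrapped in backticks in SPL), and it expands to the definition with the parameter substituted.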
>Is Fundamental 2 going to be enough?
Review the material from Fundamentals 1 and 2. That will be sufficient for this test.
>Is there anyway I can get some sample questions?
Not that I know of. But if you practice working with these things in your test instance, you will be set.
>If I fail and retake the power user exam, are the questions going to be the same for the retake?
Questions will be changed, but they will be similar the second time around. If you do fail, just take note of the problem areas and review them again in your test environment.
Have a look at what is expected from the cert, look at the course objectives to each class, and you will know what to study and practice: https://www.splunk.com/en_us/training/certification-track/splunk-core-certified-power-user/overview.html
Best of luck OP, you'll be great.
Late to the party but this might be useful for you:
You can register for a free Splunk provided class that will help you get introduced to Splunk and show you the ropes - https://www.splunk.com/en_us/training/courses/splunk-fundamentals-1.html
You can even get certified as Splunk User (also free).
I've implemented anomaly detection and prediction, and I'm also working on integrating the MLTK into ITSI.
The Machine Learning ToolKit is a good start but I never got accurate results using the anomaly detection module. So I decided to build it in core SPL instead and use the downsampled line chart viz from the MLTK to help with the visualizations.
You should first define your objectives, document your current state, and create a project plan with goals. Based on those goals, determine what data needs to be used and whether it's currently being logged to Splunk. If yes, then you need to see how your trends look; if you're dealing with cyclic data, it will be much easier to establish a pattern and predict future values. You should also check whether your data is normally distributed by binning your field and seeing how far the outliers are from the mean. This will help determine whether linear regression is right for your use case.
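As an example of the core-SPL approach, a simple z-score outlier flag over binned data can look like this; the index, sourcetype, and field names are placeholders:

```
index=app_metrics sourcetype=cpu_stats
| bin _time span=5m
| stats avg(pct_used) as value by _time
| eventstats avg(value) as mean, stdev(value) as sd
| eval zscore=(value-mean)/sd
| eval is_outlier=if(abs(zscore)>3, 1, 0)
```

Anything more than three standard deviations from the mean gets flagged; tune the threshold and span to your data's cycle.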
Below is a great link to get started
https://www.splunk.com/blog/2018/01/19/cyclical-statistical-forecasts-and-anomalies-part-1.html
Feel free to PM me or reach out on LinkedIn if you have any questions!
Go to this link and create a Splunk education account. Once you are signed in, add "Splunk Tutorial 6.x (eLearning)" to your cart; you'll have access for 30 days.
They give you links to all the course materials and demo data, you just install Splunk on your PC and go!
Nope, if you fail the first time, you can re-take the exam 5 days later. If you fail the second attempt you have to wait 30 days; fail a third time and you wait 60 days. More details are here: https://www.splunk.com/view/SP-CAAAP2W. If you already registered for the exam and got the email with your specialized link, I don't know how long that link is valid, but I'm pretty sure it will be okay for a while (30 days, maybe? I don't know).
I'd install Splunk Stream forwarder on any (or several) NTP clients and watch for that traffic and report up on transaction times.
https://www.splunk.com/en_us/products/splunk-stream.html https://docs.splunk.com/Documentation/StreamApp/7.1.1/DeployStreamApp/ProtocolDetection
Stream is passive, so this wouldn't be a health check. Instead, you'll only be able to observe the UDP traffic when it does occur. I don't recall the ntpclient specifics to know how often that happens, but I bet it's predictable.
Looks like NTP is on the detect-only list, which means you can't extract fields from a conversation--instead you just know when it happens. But you'd see the request go one way, then back.
Elastic has some time-based log analysis tools (Logstash); I have not personally used it but have seen a demo. It just depends on what problem you are trying to solve which tool to pick.
Reference: https://www.elastic.co/products/logstash
Also want to plug Cliff Stoll. Read this to understand how InfoSec got started, and to understand the kinds of events you're trying to capture and visualize.
a quick google search found this...splunk-&-ransomware
hope this helps.
btw...whats this usecase for, work or school?
To add to what halr9000 posted: this is a helpful overview and breakdown of options.
https://www.splunk.com/en_us/blog/platform/a-new-way-to-look-like-splunk.html
As with anything, the subject is more complicated than just being able to edit xml.
Here’s a good intro to load averages: https://www.site24x7.com/blog/load-average-what-is-it-and-whats-the-best-load-average-for-your-linux-servers
There are a few things to keep in mind, and other things you can do to maximise performance on your box.
1: disable THP if it’s enabled. You want to do this at boot time, like so: https://www.thegeekdiary.com/centos-rhel-7-how-to-disable-transparent-huge-pages-thp/
2: try to reserve your cores if at all possible.
3: hyperthreaded cores are almost useless for Splunk (about a 10% bonus), so if you’re using Intel CPUs you could be running on as little as ~6.6 cores. I’d definitely try to up your core count. Your load average spiking to 30 agrees that you’re out of CPU.
4: IO consumes CPU. Any time you want to write to disk, the CPU needs to handle the IO, so if you’re waiting for slow disk then the CPU is tied up and can’t process data at the same time. This doesn’t appear as CPU usage, but as CPU wait instead. You can use iostat -x
to see detailed IO information. Run it a few times, waiting 10 seconds between runs - the first report shows averages since boot rather than current activity, so ignore that first output and look at the later samples.
5: pull all the strings you have to get some fast disk for your hot/warm storage. I suspect you’re having problems because Splunk simply can’t write to the disk quickly enough.
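For step 1, one common way to make the THP disable stick across reboots is a small systemd unit. This is just a sketch - the unit name is made up, and the sysfs paths and target names may need adjusting for your distro (the linked article covers the CentOS/RHEL 7 specifics):

```ini
# /etc/systemd/system/disable-thp.service  (hypothetical unit name)
[Unit]
Description=Disable Transparent Huge Pages for Splunk
After=sysinit.target local-fs.target

[Service]
Type=oneshot
ExecStart=/bin/sh -c 'echo never > /sys/kernel/mm/transparent_hugepage/enabled && echo never > /sys/kernel/mm/transparent_hugepage/defrag'

[Install]
WantedBy=basic.target
```

Enable it with `systemctl enable disable-thp`, reboot, then confirm with `cat /sys/kernel/mm/transparent_hugepage/enabled` - you want to see `[never]`.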
Dashboard Studio is what you need to use.
https://www.splunk.com/en_us/blog/platform/dashboards-ga-introducing-splunk-dashboard-studio.html
The map is just a background image - you drop a bunch of objects on top of it.
Yeah - look at the first few characters of each raw event without the table. You'll notice it's sorted: all the a's, then all the b's, then all the c's, etc. https://www.splunk.com/en_us/blog/tips-and-tricks/order-up-custom-sort-orders.html
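If you need something other than lexicographic order, the usual trick (the same one that blog post covers) is to eval a numeric rank and sort on that instead. A sketch - the `severity` field and its values are made up for illustration:

```
... | eval sort_rank=case(severity=="critical", 1,
                          severity=="high",     2,
                          severity=="medium",   3,
                          true(),               4)
    | sort sort_rank
    | fields - sort_rank
```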
https://www.splunk.com/en_us/download/splunk-enterprise.html
You'll need to create a free Splunk account in order to download. The signup info is on the right hand side of the above link.
for splunk in a lab you'll probably want splunk enterprise (not splunk cloud). go to the below link. you'll have to create a splunk account before downloading.
https://www.splunk.com/en_us/download/splunk-enterprise.html
you'll also need to download and install a universal forwarder on the machine that will be sending logs to splunk.
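Once the UF is installed, the setup is a couple of CLI commands - a sketch, where the hostname is a placeholder and 9997 is just the conventional receiving port:

```shell
# on the universal forwarder box (default Linux install path)
/opt/splunkforwarder/bin/splunk add forward-server splunk.example.com:9997
/opt/splunkforwarder/bin/splunk add monitor /var/log
/opt/splunkforwarder/bin/splunk restart
```

Remember to enable receiving on the Splunk Enterprise side first (Settings > Forwarding and receiving in the UI, or `splunk enable listen 9997` from its CLI).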
There are several admin courses out there, but they do tend to be quite pricey. If you can't get them paid for by your employer and don't want to pay the money out of pocket, I'd recommend checking the exam blueprint for the Splunk Admin certification and then using the Splunk docs to play around with all the topics in your own Splunk environment. (There is a free Splunk version that you can set up at home.)
Best of luck.
Instead of paying for another tool, why not use Ingest Actions? It's free and built by Splunk.
The contents reappearing is a known item. From the blog: "If any jar files return in the splunk_archiver app, disabling the default Bucket Copy Trigger search in that app will stop this behavior from happening. "
This page will be a one-stop page for people to start leveraging Splunk to detect and defend against Log4Shell vulnerability.
Just updating here... there's been this blog and a few others:
Will update as we go. Good news is core Splunk Enterprise has been patched and can be downloaded.
> To take the test for certification you need to take the Splunk specific trainings, though they are pricy.
This is not true. OP, read about it yourself from Splunk rather than trusting the internet. The quick links here have everything you need to know: https://www.splunk.com/en_us/training.html
If you're highly motivated, you can get up to Splunk Certified Admin without paying Splunk anything besides the price of the exam. I'm not saying I recommend it, but you can.
You can process the events fully with the lambda function. This will allow you to do whatever you wish. Refer to this for general guidance outside of masking https://www.splunk.com/en_us/blog/tips-and-tricks/how-to-ingest-any-log-from-aws-cloudwatch-logs-via-firehose.html
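A minimal sketch of such a transformation Lambda in Python. It assumes the standard CloudWatch Logs → Firehose delivery, where each record's `data` is a gzipped, base64-encoded JSON payload; the email-masking rule is my own example, not from the blog, and whether you re-compress on the way out depends on your destination's expectations:

```python
import base64
import gzip
import json
import re


def handler(event, context):
    """Firehose transformation Lambda: unpack CloudWatch Logs records,
    mask email addresses in each log message, and re-emit the records."""
    output = []
    for record in event["records"]:
        # CloudWatch Logs -> Firehose payloads arrive gzipped and base64-encoded
        payload = json.loads(gzip.decompress(base64.b64decode(record["data"])))
        for log_event in payload.get("logEvents", []):
            # Hypothetical masking rule: redact anything shaped like an email
            log_event["message"] = re.sub(
                r"[\w.+-]+@[\w-]+\.[\w.]+", "<masked>", log_event["message"]
            )
        output.append({
            "recordId": record["recordId"],
            "result": "Ok",  # tells Firehose the record transformed successfully
            # returned uncompressed here; adjust for your destination
            "data": base64.b64encode(json.dumps(payload).encode()).decode(),
        })
    return {"records": output}
```

The same loop is where you'd drop in any other per-event processing (enrichment, filtering by returning `"result": "Dropped"`, reshaping fields, and so on).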
Link to the test Blueprint is below. Reading through the PDFs that came with the courses is very important. The sections of the test should match up with a chapter/section of the PDFs.
https://www.splunk.com/pdfs/training/Splunk-Test-Blueprint-Admin-v.1.1.pdf
The classes have never been the same thing as certification. Fundamentals 1 was free, that's true, but the User level certification has not been free for at least 2 years, ever since the certification process became proctored.
Each certification has a $125 cost per attempt, or you can purchase 5 vouchers for $500.
It's important not to confuse or mix together the classes you can take and the certifications you can get. This link is pretty recent, looks like it was updated on Nov 2:
https://www.splunk.com/pdfs/training/Splunk-Certification-Candidate-Handbook.pdf
I had plans to do the same thing. My plan was to use OpenHardwareMonitor (https://openhardwaremonitor.org/screenshots/). It has an option for writing out all of the sensors via CSV.
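If you go that route, pointing a universal forwarder at the CSV output is a one-stanza inputs.conf job. A sketch - the path and index are placeholders, and whether Splunk's built-in `csv` sourcetype parses OpenHardwareMonitor's output cleanly is something to verify:

```ini
# inputs.conf on the Windows box running the UF
[monitor://C:\Tools\OpenHardwareMonitor\OpenHardwareMonitorLog-*.csv]
sourcetype = csv
index = hw_metrics
disabled = 0
```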
Read through the Enterprise Security exam blueprint.
For any topics you don't know very well, go through the class slides and documentation. Learn those topics.
Also, ES takes experience. Use the product so you know what actions take place on what screens.
Look at the roadmaps for the Certified Cloud Admin and Certified Enterprise Admin certifications - they go through all aspects of maintaining and managing the core offerings:
https://www.splunk.com/en_us/training/certification-track/splunk-enterprise-certified-admin.html
https://www.splunk.com/en_us/training/certification-track/splunk-cloud-certified-admin/overview.html
So local is searching that standalone, and remote is searching others. If you're standalone, the best way to improve performance is faster disk. You can add CPU cores but that will only help if CPU is already maxing out.
The more complex solution is to separate search and indexing by adding dedicated indexers with dedicated storage, since you can then scale nearly linearly.
It honestly sounds like you need to resize your Splunk based on search users and ingest volume. See https://www.splunk.com/pdfs/technical-briefs/splunk-validated-architectures.pdf
This assumes you have many users and over 200GB ingest. If not then your disk is probably just too slow.
Start with the exam blueprints (a master list is here: https://www.splunk.com/pdfs/training/Splunk-Certification-Exams-Study-Guide.pdf) - the one for Advanced Power User is https://www.splunk.com/pdfs/training/Splunk-Test-Blueprint-Advanced-Power-User-v.1.1.pdf
Then review what's in your course materials
Yeah, so the recert policy outlines the 3 options. The 3rd option is taking a course to "refresh" the cert. So instead of having to take those 2 tests again I can take Transitioning to Splunk Cloud (which I had already planned on taking) and something like Fundamentals III to refresh my certs.
https://www.splunk.com/pdfs/training/Splunk-Recertification-Policy.pdf
If you go to your dashboard within splunk.com you should still be able to get a 7 day trial of ES Sandbox environment complete with dummy data under Free Trials and Downloads.
There are definitely some questions on the exam that rely on knowing how to use the CLI, and what various options do. Explicitly, 7.0 on the blueprint is knowing how to "Add an input to UF using CLI."
https://www.splunk.com/pdfs/training/Splunk-Test-Blueprint-Admin-v.1.1.pdf
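For that blueprint item specifically, it's worth having actually run the command once rather than just reading about it - e.g. (index and sourcetype values are just examples):

```shell
# from $SPLUNK_HOME/bin on the universal forwarder
./splunk add monitor /var/log/messages -index main -sourcetype syslog
./splunk list monitor
```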
The admin cert blueprint is linked. This will show you the breakdown by chapter and what percentage each makes up for the total score.
It's been a few years since I took the admin cert so I don't remember much of what was on the test but I do know there were a few CLI questions. It is multiple choice, which helps. I don't think you need to have every CLI command memorized but having an understanding of the commands should be enough to find the correct answer.
https://www.splunk.com/pdfs/training/Splunk-Test-Blueprint-Admin-v.1.1.pdf
Here is what I read about the new ITSI Content Pack for MS Exchange. Based on this, it appears they're taking the ‘premium apps’ and making more features/functions available for free via the ITSI Content Library (Cloud) and through Splunk Docs / Splunkbase (On-Prem). In short, it's a win-win that simplifies implementation while reducing time to value. Don’t jump off the ledge... certainly have a drink, it’s Friday!
https://www.splunk.com/en_us/blog/it/getting-the-most-out-of-microsoft-exchange-and-splunk-itsi.html
I could have typed out this one and several others, but it's easier to simply paste a link. :) https://www.splunk.com/en_us/customers/success-stories/compassion-international.html
This use case goes beyond IT into the business and across multiple branches of the organization.
I also did the 7.x Fund 1 & 2, and I felt that it was enough to pass the Power User exam.
Just go through your Fund 2 material and review the topics in this blueprint, and you should do fine.
Start here: https://docs.splunk.com/Documentation/ES/6.4.1/Admin/Addlocalthreatintel
And depending on how new you are, this may also be helpful to walk through where to go from there: https://www.splunk.com/en_us/blog/security/threat-intel-and-splunk-enterprise-security-part-2-adding-local-intel-to-enterprise-security.html
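For local file-based intel, the docs above walk you through dropping a CSV into the threat intel lookups. Roughly like this - I'm writing the header fields from memory, so confirm the exact names against the Addlocalthreatintel doc before using it:

```
# e.g. a local_ip_intel.csv lookup file (IPs below are documentation ranges)
description,ip,weight
scanner from internal IR report,203.0.113.7,1
phishing infrastructure,198.51.100.23,1
```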