I'm fond of Renovate, which can be used to update dependencies, configs, and the like. I think it would slot in near Helm and Argo CD in your ecosystem there.
Basically, your build system would update something, Renovate would go change the configs in GitHub, and then Argo CD would reconcile the new configs with the cluster.
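To picture that last step, here's a minimal sketch of an Argo CD Application that watches a config repo; the repo URL, paths, and names are placeholders I made up, not anything from your setup:

# Sketch of an Argo CD Application watching the config repo (names and URLs are placeholders)
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: my-app
  namespace: argocd
spec:
  project: default
  source:
    repoURL: https://github.com/example/deploy-configs.git  # repo Renovate opens PRs against
    targetRevision: main
    path: charts/my-app
  destination:
    server: https://kubernetes.default.svc
    namespace: my-app
  syncPolicy:
    automated:       # auto-sync so merged Renovate PRs roll out without a manual sync
      prune: true
      selfHeal: true

Once Renovate's PR is merged, Argo CD notices the new commit on main and reconciles the cluster to match.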
I should have phrased that better. My uneducated guess is that the release note in this case was a copy/paste of commit messages created during development. Possibly whoever made the commit was using an automated solution (e.g. WhiteSource) for version bumps, which provided an actual message to the user after updating their vehicle. If the tool is not used consistently, patch notes may not make it to the end user.
Large, well-known, established open source software projects have had serious security problems too. Not just "I can crash your server then post about my bravery on reddit", but "I can escalate privilege and pwn your shit" stuff.
First result on Google for security bugs in open source: Top Open Source Security Vulnerabilities for 2016, featuring glibc, Android, OpenSSL, the Linux kernel, MySQL, and OpenJDK.
None of it is really newsworthy if we've been paying attention. We generally accept that shit happens, try to design our systems in layers to resist a single point of compromise, and hope we can race the bad guys to fix shit before they exploit it.
And maybe that bears repeating. Folks who maliciously exploit a bug to disrupt other people's systems are the bad guys. Folks who make a point of notifying those bad guys as early as possible in the window between when a fix is published and when it is widely distributed are also the bad guys. (Cue the cries of "Oh, but there was no possible way to know those tweets were going to be instrumental in the commission of multiple felonies.")
This would once have been non-controversial, but some folks somewhere have misplaced their moral compass.
And this is precisely why we use tools like Black Duck and WhiteSource in large corporate environments: code cargo-culted from Stack Exchange, GitHub, and hundreds of thousands of FOSS projects gets caught and ripped out.
It’s a major legal and compliance risk to include that in most applications unless you have free and clear authorization and a compatible license to use it in your project.
Oh man, I can just imagine making little cartoon characters for Bash, GNU and Kernel, something like this.
Assuming you're using an infrastructure-as-code approach, you can use something like https://www.whitesourcesoftware.com/free-developer-tools/renovate/
i.e. set up a Helmfile with your installations and check it in somewhere; Renovate will monitor that and make pull requests when a new version is available.
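For example, a bare-bones helmfile.yaml might look something like this (the chart, namespace, and version are made-up examples, not anything from your setup):

# helmfile.yaml (sketch)
repositories:
  - name: grafana
    url: https://grafana.github.io/helm-charts

releases:
  - name: grafana
    namespace: monitoring
    chart: grafana/grafana
    version: 6.50.0  # pinned chart version; Renovate proposes a PR when a newer one is published

Because the version is pinned in the file, Renovate has something concrete to bump, and every upgrade goes through your normal review process.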
Relying on installing the latest versions as your security approach doesn't have much to do with actual security ;-) The latest versions only get installed when someone runs:
npm update
- so you rely on someone actually doing that (manually) from time to time. And if you're not actively developing the project, you won't update the dependencies at all.
For dependency security, valid solutions are:
I think that's a fallacy. See, for example: https://www.whitesourcesoftware.com/resources/blog/3-reasons-why-open-source-is-safer-than-commercial-software/
Personally, I see the following advantages of open source above all: no dependence on third-party companies that can go bankrupt and then no longer fix the software. Bugs can be discovered and fixed early, which works best when you have many developers; a company may only have a few employees, who probably have other things to do as well…
Security by obscurity is not a good guiding principle!
There are good reasons why the Apache and MIT licenses lead, covering more than half of all open source projects - https://www.whitesourcesoftware.com/resources/blog/open-source-licenses-trends-and-predictions/ - with the GPL trailing at 20% (10% for v2 and 10% for v3).
The GPL and AGPL are not about freedom - they are about restrictions and ensuring a level playing field. True freedom (no conditions) is achieved through the public domain or, in jurisdictions that don't recognize it, a permissive license with no attribution clause.
I think your example sort of proves my point: in your example, such a company is still using open source. Your prediction is basically that they would stop using gcc and replace it with a proprietary or homemade solution. But no, they keep using gcc, even if they spend time applying patches. I can only guess that this is a rational decision: even with this effort, using open source is still better value than alternatives. This is my point.
Also, people are still using OpenSSL (I guess you mean that, not openssh), which was much more critical than log4j. The problem got fixed, and open source continues to grow in economic importance. In particular, the OpenSSL project saw a large increase in bug reports following Heartbleed: rather than abandoning it, the users (companies, mostly) worked on fixing it.
https://www.whitesourcesoftware.com/resources/blog/how-the-heartbleed-vulnerability-shaped-openssl/
It was an infinite recursion.
My assumption is that the CVSS score was calculated like this.
You will need to check the licenses of the projects -- you can usually find them in a LICENSE
file at the root of the project.
Generally speaking, open source licenses are split into two different categories: permissive and copyleft.
You can build commercial software using both, but you have to be careful with copyleft licenses. If you use them, you may be legally obligated to open-source your own work, which might not always be the best idea from a business standpoint.
You can find a more detailed overview of open source licenses on websites like https://choosealicense.com/licenses/, https://opensource.org/licenses, and https://www.whitesourcesoftware.com/resources/blog/open-source-licenses-explained/
If you want to be absolutely safe, I'd stick with permissive licenses such as BSD, MIT, and Apache.
Well, it is just the tip of the iceberg. For the past month I've been reporting malicious NPM packages and versions (due to account takeovers) on a daily basis - over 350 in the last 30 days.
I tried to aggregate the data on this incident here (as someone suggested):
https://github.com/faisalman/ua-parser-js/issues/536#issuecomment-949936808
Here's a short writeup I did on that particular case: https://www.whitesourcesoftware.com/resources/blog/popular-javascript-library-ua-parser-js-compromised-via-account-takeover/
I'd ask the projects you intend to consume in your service what they're comfortable with. I've heard that the plain GPL doesn't preclude the software being offered as a SaaS for profit, and a number of projects have changed their license after being hit by competing companies doing exactly that and then not publishing the changes they've made, since selling a hosted service avoids the requirements that distributing it for download would trigger. They seem to move to the AGPL in those cases.
https://www.whitesourcesoftware.com/resources/blog/the-saas-loophole-in-gpl-open-source-licenses/
Zlib uses its own license, which is permissive, so there are no copyleft worries there. You can choose most others you might like to use. The Apache 2.0 and MIT licenses are the most popular in open source.
I'm not sure which license will ensure the product stays free - the idea behind most of these licenses is not that you can't make money selling code, but that the source has to be freely available. In fact, the ability to sell the software is part of the Open Source Definition.
I'm not too good with Docker, but I think you need to publish ports 8096 and 8920 from the container so they can be reached from outside it. That said, that's probably not the issue here, since you can already access it from the LAN. Another thing: make sure the "Remote IP address filter" list under Networking is filled in correctly and the mode is set right. I would also open the F12 dev tools in the web browser and see what errors show up.
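For reference, publishing those ports with Docker Compose could look roughly like this (the jellyfin image name and volume paths are my assumptions, not taken from your post):

# docker-compose.yml (sketch; adjust image and volumes to your setup)
version: "3"
services:
  jellyfin:
    image: jellyfin/jellyfin  # assumed image
    ports:
      - "8096:8096"  # HTTP web UI
      - "8920:8920"  # HTTPS, if configured
    volumes:
      - ./config:/config
      - ./media:/media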
Hope this helps
There are a lot of different rights for software, ranging from "you aren't even allowed to see the code" (copyright) to "do whatever you want with it" (copyleft).
You could grab copyleft source code and sell it as is. The only restriction of copyleft is that you can't include it in a project that's under a different licence (this is the case with the GPL licence).
Other open source licences are more permissive and allow use in commercial projects under certain conditions. For instance, the LGPL allows you to use the code in commercial products as long as the code that was originally under the LGPL stays under the LGPL; the rest can be under whatever licence you want.
Licensing is not something a simple programmer can understand completely. When there's a tool you want to use, you need to find out what its licence is and then google whether that licence is compatible with the one you intend to use.
This site offers a general description of licences, but keep in mind that licences can change and can be customized, so whatever you read there is only a guide. If your project is big, you should consult a lawyer:
https://www.whitesourcesoftware.com/resources/blog/open-source-licenses-explained/
Funny names in science are kind of a thing. Sonic the Hedgehog gene is my favorite example.
This happens regularly in computer science as well. GNU = GNU's Not Unix is pretty old.
A lot of famous scientists are creative and have a sense of playfulness.
> And what does the kernel maintenance team would do? They're not security experts
They kind of are. Most of them are employed by large companies to maintain the kernel.
Security audits miss a lot. Microsoft has an internal audit team, yet Windows has a ton of security vulnerabilities that get reported.
The main difference between proprietary code and open source code is that anyone is free to audit (and fix) open source code. Big companies with internal security teams do just that for critical code, and the big takeaways from Heartbleed were audits and better funding. Companies didn't switch to proprietary code; instead, they fund the projects they use better.
> OSS = security
OSS isn't secure by nature; it's merely open, so anyone can verify that it's secure. A company's claim to have done a security audit is only as good as the reputation of the security company doing the audit, whereas an audit of an open source project can be reviewed by the community at large. That's what happened with OpenSSL.
https://www.whitesourcesoftware.com/free-developer-tools/bolt-for-azure-vs-full-solution/ scans your dependencies and checks for known vulnerabilities and possible licensing issues. We have it integrated into our build pipelines in Azure DevOps.
Expose your container ports to your Linux server. So if you are using the Grafana container, expose container port 3000 and it will be available at 10.0.0.x:3000, or whatever IP your Linux server has. Never connect to the 172.x.x.x address; just expose your ports to the Linux server. Use Docker Compose to spin up your containers and use the ports section to publish them.
https://www.whitesourcesoftware.com/free-developer-tools/blog/docker-expose-port/
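As an illustration, a minimal docker-compose.yml along those lines might look like this (the image, port, and volume names are generic examples, not from the poster's setup):

# docker-compose.yml (sketch)
version: "3"
services:
  grafana:
    image: grafana/grafana
    ports:
      - "3000:3000"  # host:container, so it's reachable at 10.0.0.x:3000 on the LAN
    volumes:
      - grafana-data:/var/lib/grafana
volumes:
  grafana-data: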
The policy is to use WhiteSource, which flags possible vulnerabilities. It runs as a build pipeline step and is managed by the CISO team (not the devs!). If we get a true positive we evaluate it, but most likely we don't approve it.
The title is pure clickbait; the article never says which the "3 least secure programming languages" actually are.
It simply quotes this report (which they forgot to link): https://www.whitesourcesoftware.com/open-source-vulnerability-management-report
The report was written by a company that sells vulnerability detection software, and naturally they want to make their product look as necessary as possible.
The TechRepublic article does point out that the high number of vulnerabilities the report found in programs written in C has to do with the sheer number of programs written in C, but it doesn't attempt to calculate a ratio such as vulnerabilities per line of code, so it doesn't really add much to a report that must itself be read with a pinch of skepticism.