I may be wrong, but the way you speak makes me think you come from a more rigid OOP language.
Welcome to the jungle.
Depending on whether you're developing an API, a server serving web pages (with or without SSR), a desktop app, or a CLI app, what will make your life easier won't be the same.
In today's development, we usually don't group files by concern (like a single controllers folder), but by functionality. You can take a look at this example for inspiration: https://dev.to/lars124/how-i-structure-my-rest-apis-11k4
The entry point isn't standardized either: main, index, app... nothing is mandatory.
https://nodejs.org/en/knowledge/HTTP/servers/how-to-create-a-HTTP-server/
Node.js won't exit until all open sockets (connections, ports) are closed.
The same goes if you connect to a database: the process won't exit until you disconnect.
In my view, it is more "complete" than dead. It has everything they consider useful for the framework, and more drastic changes will not be made because of the huge usage.
So many companies rely on it that it will get security patches, however, I would not expect any new big features anymore.
If you are searching for an alternative, I really like NestJS, it is less bare-bones and more opinionated than Express.
Hi Ryan, thanks for doing this.
npm recently hit 1.0 (congrats Isaacs), would you consider it likely that npm (or other 3rd party libraries) would eventually work their way into node core?
Have you any thoughts on SpiderNode (other than 'competition is good')? Is there anything that you particularly like (or dislike) about SpiderMonkey in comparison to V8?
Because of the hype, basically. The times when Node.js and MongoDB became buzzwords overlapped enough to make the combination an "obvious" choice. Especially with MongoDB using JSON for everything (even as a query language).
Basically what made MongoDB so successful is that it was one of the first widely-recognized NoSQL solutions in a time when everybody thought they had "Big Data" and needed a database that was built for scalability first (and integrity last).
Additionally, "MEAN" provided a nice buzzword, and now the idea that Node.js and MongoDB go together perfectly is stuck in everyone's head, and the media (i.e. bloggers and tech websites) perpetuates it because they're too lazy to do the research to come up with something more practical.
I would strongly recommend against MongoDB, especially as a "default" when you don't know your actual requirements. There are plenty of viable databases (non-relational and relational alike), and in all likelihood MongoDB is not the best fit for your project's requirements.
Personally, I prefer ArangoDB these days (disclosure: I've become a maintainer because I was already using it and interested in working on it). But I'd say for most projects PostgreSQL is a good default. Of course nothing beats actually doing your homework and seeing what database fits your project best before you pick it.
`body-parser` is pretty awful and is used by many Node applications. Send it incorrect JSON input and it'll error out, potentially killing the application if not handled correctly.
Sending node a burst of concurrent connections along a request path that will cause a lot of IO events to trigger may also cripple node, as the standard library (which a lot of developers utilise) is generally blocking: https://nodejs.org/en/docs/guides/blocking-vs-non-blocking/
If you're trying to be reeeeally crafty, a major weakness in the npm ecosystem was exploited to effectively install malware into a package that was a dependency of countless other packages, causing it to be downloaded if `npm install` was run during the time it was live. This ecosystem is incredibly intertwined with pretty bad security practices. If you had a target application in mind, you could try to figure out a particular Node module they were using, find out all of its dependencies (and dependencies of dependencies), and somewhere along the line you attack the weakest link to publish a malicious version of a package that is eventually installed onto the server in question (as well as likely many others!).
There's a lot of others out there but that should give you some starters :)
Most hosting sites don't let you execute processes; instead they give you access to a folder where you can create files that get parsed and served by an Apache-type server. This lets you easily make a website out of HTML files, which just get served to the browser, or PHP files, which get parsed and run, generating output that gets served to the browser. Node.js apps run as their own process and need to be executed differently, resulting in the need for other hosting sites like Nodejitsu.
I highly recommend using DigitalOcean, as it not only lets you run Node.js apps, it gives you access to your own virtual private server: a machine you can install whatever OS you want on. You can then ssh into it and run Node.js apps as if you were on your own computer. It's also cheaper than most of the other hosting companies I have seen, since the starting price is only $5/month. Virtual private servers also teach you more because you learn how to set up your app yourself.
Short answer: Wait six months.
Long answer: Read https://nodejs.org/en/about/releases/
tl;dr: Only use even #'ed releases in production. And wait until they reach "Active" status (6 months after initial release, gives library authors time to add support)
If they do it, just send them a link to pnpm.
pnpm is a Node.js package manager that uses a global store and hard links to link packages from that store into your projects.
>One version of a package is saved only ever once on a disk. So you save dozens of gigabytes of disk space!
It has existed for several years already, is well maintained, and is used by some big projects.
Main difference is in the middleware mechanism.
Express uses callbacks, so middlewares are called one after another: finish 1, next, finish 2, next, etc.
With Koa the second middleware is awaited inside the first middleware: finish middleware 1, await middleware 2 from inside middleware 1, finish middleware 2, await middleware 3 from inside middleware 2.
When all middlewares resolve, you go back to middleware 1.
The style is different and you have to know what is going on, but in practice it does not change much.
Promises are awesome and you should prefer koa. But consider that express is more mature and has been working really well for many people.
Promises might cause a bit more memory usage, though it shouldn't be noticeable. I might be wrong, but I think Koa keeps more stuff in memory, since the middleware context is maintained until the response is sent; on the other hand, because it's truly async, it uses the processor better. I am speculating here; I have no proof.
Edit: An article I found says Koa is faster. You shouldn't trust articles that benchmark Node servers on a 1-core i7 under Windows 10 with an Ubuntu VM without referring to memory usage. Usually people like to deploy Node servers on nano or micro AWS instances, which are not much better than a dedicated Raspberry Pi 2. I bet he didn't even enable production settings for either of them, so that test is worth nothing and should be taken with a mountain of salt. Express has a lot more going on than Koa, and without production settings it might be doing a lot more things to ease your development experience.
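The onion-style flow described above can be sketched without Koa itself. This is a toy `compose`, not Koa's actual implementation, just to illustrate how each middleware awaits the next one:

```javascript
// Minimal Koa-style middleware composition: each middleware awaits the next.
function compose(middlewares) {
  return async function run(ctx) {
    async function dispatch(i) {
      if (i === middlewares.length) return;
      // `next` is a function that runs the following middleware.
      await middlewares[i](ctx, () => dispatch(i + 1));
    }
    await dispatch(0);
  };
}

const order = [];
const app = compose([
  async (ctx, next) => { order.push('1 start'); await next(); order.push('1 end'); },
  async (ctx, next) => { order.push('2 start'); await next(); order.push('2 end'); },
]);

app({}).then(() => console.log(order.join(' -> ')));
// logs: 1 start -> 2 start -> 2 end -> 1 end
```

Note how control returns to middleware 1 only after middleware 2 has fully resolved; with Express-style callbacks there is no equivalent "second half" after `next()` for async work.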
Yes. There are two major things I want to accomplish before I call it 1.0.
Run really well on Windows with IOCP. This effort is underway https://github.com/joyent/liboio
Flesh out the "multi-node" story. This is more ambiguous - but I think we can add a lot of value by showing people how to hook multiple instances of Node together to form networks. The non-blocking async nature of Node means these nodes will be able to recv and send messages independent of what's going on elsewhere in the process. I think this will be really interesting and fun.
Really awesome article. It's interesting for a few reasons:
It's a great post which explains a lot about Express routing, and things to look out for (easy gotchas).
You might also find the HackerNews comments useful on this article (lots of good discussion): https://news.ycombinator.com/item?id=8631022
I sent a similar email about my own concerns. While I don’t disagree that Reddit has a toxic element of anonymous users, I am especially troubled that they would characterize all 250m users as such, or even the 32k users of the Node subreddit. I do not appreciate being categorically referred to in the manner that this employee has chosen to, and I felt it in the community’s best interest to make my feelings on the matter known to those that she represents.
I am not an anonymous user. My profiles are public and I know that at least the COO read my email as my LinkedIn account was visited minutes later. I have no reason to hide myself. I am expressing plainly that the way this person chooses to interact with people is hateful and vile and should not be tolerated within an inclusive community. The fact that the employee even spoke on behalf of npm, Inc. and Node to inform the public that their opinions are not respected and are actually laughed at is especially hateful, disgusting, and divisive behavior.
I received a form response, which is to be expected, but I sincerely hope that npm, Inc. provides an official response in regard to the antics of this individual. That is troubling behavior for a company, and it causes true concern for me and people I have discussed this matter with privately. The future of the Node community is looking pretty grim if this is where it's heading.
Edit: If you care about the future of Node then I suggest joining the foundation as an individual member. It’s $100 to join, $25 if you’re a student, and free for contributors to Node. https://nodejs.org/en/foundation/members/
Install and use body-parser:

```javascript
var bodyParser = require("body-parser");
app.use(bodyParser.urlencoded({ extended: true }));
app.use(bodyParser.json()); // To accept data in the form of JSON.
```

If you have an input of:

```html
<input placeholder="Select a number" name="userNumber" />
```

The data will come in the form of:

```javascript
req.body;            // All form data.
req.body.userNumber; // The user number.
```
See the Express documentation for body-parser. You might also want to watch the Node.js series from The Net Ninja channel on YouTube.
Personally, I use docker for practically everything.
I'd run nginx in a container to provide HTTPS (probably need a certbot container as well if you're going to use a free certificate from letsencrypt.org) and reverse proxy to your node app, and then dockerize your node app https://nodejs.org/en/docs/guides/nodejs-docker-webapp/
PostgreSQL has had pretty decent JSON support since 9.1, and it's been getting better since then: http://www.postgresql.org/docs/9.4/static/functions-json.html
Adding a JSON field into a relational table can be pretty good approach to a number of tricky scenarios, especially avoiding pivots for settings and the like. Array support in PostgreSQL is also very nice.
The concept of microservices, though simple at heart, is a big chapter and will only raise more questions than you can imagine. As a junior dev you should really focus on monoliths. They are not as dead as some "tech bloggers" make it seem. As a matter of fact, for most projects, it's wiser to start with a monolith and move to a microservice architecture if and when scalability becomes a problem.
However, if you still want to explore this territory, know that the code is not as important as the architecture. The code is mostly things you do in a monolith ie. REST/GraphQL APIs, Queues, etc. So, I would suggest grabbing a book, such as Monolith to Microservices by Sam Newman, to learn how to go about breaking up a monolith into individual services.
After that, you'll need to know how to manage the source code (ie. multi/mono repos) and how to deploy and manage the services (ie. containers, docker swarm/kubernetes, etc).
String and date formatting are part of the JS standard lib.
Streams are part of the Node.js standard lib https://nodejs.org/dist/latest-v10.x/docs/api/stream.html
The current "standard" for Deno modules has been to have a `deps.ts` file where all dependencies are listed:

```typescript
export { Response, serve, ServerRequest } from "https://deno.land/[email protected]/http/server.ts";
export { basename, extname, join, isAbsolute, normalize, parse, resolve, sep } from "https://deno.land/[email protected]/fs/path/mod.ts";
```

And then in the rest of your code you would import from that `deps.ts` file. For example:

```typescript
import { join } from './deps.ts';
```
The issue that we had was production profiling/monitoring. We used Meteor.js, and there was a memory leak that one of our junior developers wrote. It took us too long to find and fix. Now we use Trace[1]; it helped a lot.
Um, looks like reddit isn't displaying the 11 comments on this post. I'm sure somebody already recommended Express, which is not outdated and doesn't have too much needless stuff if you're looking to write a web server.
Great tips, but most of it isn't really specific to Node or Javascript. This is mostly a stripped-down version of the advice in the book Clean Code, which is a must-read for anyone who hasn't read it already.
Yes, GUIs always use event loops and node is good at scripting event loops. There is already a node-gtk module https://github.com/Tim-Smart/node-gtk and http://www.plask.org/ (using v8/node to script Cocoa/Skia visualizations). I'm interested to see how Node can be used to build mobile apps.
I hadn't heard about Deno before today.
So, based on my read, this is a redesign of Node using modern concepts that didn't exist when Node was first created?
Could be potentially useful.
But as I read further... I see a glaring issue that will keep me from ever using it: importing directly from urls.
Horrible, horrible idea. Even their suggested fix is absolutely horrible for production level code.
My take: Deno appears to be fine for sysadmins to build scripts that are more complex than a shell script can handle, but I sincerely do not see it ever being used for major applications such as web or even CLI applications.
Remember (per the docs), odd-numbered releases are not meant to be used in production environments. They are the “beta but not beta” releases, which will not get long term support.
> There are a number of libraries that do the same thing. E.g. MySQL/Postgres client. How can an outsider, developer who's not watching the mailing list closely, choose the "right" one?
Hopefully that will be solved over time as Node matures. We've discussed having a "recommended modules" page - but perhaps it's still too young. For now ask on IRC.
> To backend developer who are not familiar with javascript ecosystem, getting started in node.js can be quite confusing. Can you explain what are the role of CommonJS, underscore.js, and backbone.js as part of the development stack?
Consider CommonJS extinct - not worth thinking further about. That was a 2009 thing. Underscore is a nice utility library. backbone.js gives you a nice client side model layer.
> It isn't obvious to me what is the best way to share the same javascript code between backend and front end. Any pointer/example?
Check out dnode https://github.com/substack/dnode I think he's got some nice examples. In general client-side and server-side don't tend to do the same thing, so there's not much to share.
> How to separate the code cleanly between model logic and the request handlers if the final callback has to always deal with the requests' response?
That's a better question for IRC.
> Are you going to be at JSConf?
Yep, I'm here in Portland now.
Vercel has worked very well for us for free, and on top of it provides a full frontend CDN deployment and API serverless deployment automatically on every GitHub commit—it will also allow you to do so for pull requests so you can have separate deployments on commits not on the master branch, say, for staging and development. Super easy to set up as well.
Well then it's not static, is it?
Anyways, if you want to avoid Express, you have two alternatives:
Don't use a framework. Use Node's low-level API, which for simple things should suffice.
Use a different framework. There are plenty of alternatives out there.
Using a framework, and more generally, having dependencies is a trade off. Dependencies will save you writing software for problems that are already solved. On the other hand, you tie your program to other things that are out of your hand.
It is true that with big dependencies like frameworks, sometimes "you wanted a banana but what you got was a gorilla holding the banana and the entire jungle."
There is no simple answer here. You'll have to think about what you want to accomplish and carefully consider the options you have.
What about gitlab.com? They integrate gitlab-ci even in their free service. Or you can host gitlab(-ci) yourself if the free service of gitlab.com is not enough for you ..
A 4-digit number has low entropy; most OTPs are 6 digits. If you go ahead with 4 digits, make sure the validation period is very short (15 minutes or so). Math.random() should be fine, but you can take it a step further and use crypto.randomInt.
One thing you definitely want to do is strongly secure the login endpoint against brute-force attempts. With a 4-digit number it's very easy to make a correct guess. Add an exponential rate limiter that kicks in after 3 wrong tries. Block the account if it has had more than x wrong tries in a short period and send a recovery link. It also helps to keep a record of the user's usual login location; if you notice a login attempt from a different location, don't allow OTP and use SSO instead.
WS is faster than Socket.IO (which also offers a polling fallback for non-WebSocket clients), but they both accomplish the same task. Both are capable of real-time communication, and the biggest limiting factor is likely network and CPU overhead, not the WebSocket abstraction layer. Agar.io is not a high-performance game. Having a working game sooner is sometimes more important than faster code, as hardware is cheap (compared to time). Professional game backends use lower-level languages, not Node.js, in the first place.
You're comparing the wrong things here. Rails is a framework that runs on top of Ruby. Node.js is not a framework; it is simply a thing that lets you run JavaScript outside a browser. The closest equivalent to Rails would be something like Express.js, Koa.js, or Next.js.
You can use most ES6 features in Node 4 without Babel. When you find one that isn't supported, then you can start using Babel or wait until the feature is added to V8 and Node, OR you can use flags on the `node` command to enable features: https://nodejs.org/en/docs/es6/
I have been using `babel` because I want `async/await`, which is still just a proposal. With Babel I use `npm i --save babel-runtime` and `babel --optional runtime --stage 2` for compiling.
Yes, they do.
With more cores the OS can use them to perform other tasks while node is busy in just one core, so overall you get some benefit.
But the right solution in this case is to launch multiple Node processes (https://nodejs.org/api/cluster.html) and use as many cores as you want.
Just for clarification: "draxt is a jQuery-like utility module for selecting and manipulating file system's objects in node.js environment. It uses glob patterns as it's "selector engine". draxt also provides several DOM-like interfaces representing file system's objects that use promisified fs module's APIs."
Meteor is pretty stable from what I've seen.
Sails.js is a popular and stable Node framework. I haven't used it, but from what I've seen it's up to snuff. It doesn't come with out of the box testing, but it does have a section in its documentation about testing, and details the process for installing mocha and setting it up. It seems very straightforward.
NestJS is another Node framework that's newer than Sails (IIRC), but I much prefer it. It takes inspiration from Angular, if you're familiar with that. Nest comes with a CLI for scaffolding projects quickly, and they also have their own testing packages that complement it, making testing far more integrated than in Sails.
I prefer Nest, and I have a feeling that you might like it over Sails. Either way, what makes Sails or Nest or Meteor / Node or Rails good is what you're comfortable with. Both Node and Ruby on Rails are very capable of achieving what you'd like. There are pros and cons to each, and choosing what you are most comfortable with here will be the better choice for you.
If you're just looking to learn Node for the sake of learning, I suggest picking a framework and just trying it out.
You could just enumerate the teams a user is a member of in the claims portion of the JWT. When validating the JWT, check to see if the current team being accessed is part of the JWT claim.
https://auth0.com/docs/tokens/json-web-tokens/json-web-token-claims
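A sketch of that check, assuming the token's signature has already been verified by whatever JWT library you use. The `teams` claim name and the payload shape are my own convention, not a standard claim:

```javascript
// Hypothetical decoded JWT payload; `teams` is a custom claim we chose to embed.
const decoded = {
  sub: 'user-123',
  teams: ['team-a', 'team-b'],
};

// After signature verification, authorize the request against the claim.
function canAccessTeam(payload, teamId) {
  return Array.isArray(payload.teams) && payload.teams.includes(teamId);
}

console.log(canAccessTeam(decoded, 'team-a')); // true
console.log(canAccessTeam(decoded, 'team-z')); // false
```

One caveat of this design: the team list is frozen into the token at issue time, so membership changes only take effect when the token is reissued.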
It isn't related to Typescript, but to Webpack. Hot module reloading is undecidable in general and you have to write code to cover edge cases. https://webpack.js.org/concepts/hot-module-replacement/
https://webpack.js.org/guides/hot-module-replacement/#gotchas
I use Digital Ocean as well, but there is another option that has a free tier: https://www.openshift.com/. If you are concerned about your app going idle, you could use something like uptimerobot.com to keep it alive.
Edit: It looks like OpenShift is in the process of moving to a new platform and while that is happening you can't open an account with their existing platform and their new platform is in beta so you only get 30 days :(
> I need the flexibility provided by mongoose documents.
Look into Postgres — it has JSON support with efficient storage, lookups and full-blown indexing. I believe it even outperforms mongo v2, both in storage and performance, not sure how mongo v3 compares though.
Run postgresql over ssl and use ssl certificate authentication.
http://www.postgresql.org/docs/9.1/static/ssl-tcp.html
(Though I would probably hit HAProxy on localhost and use HAProxy to run the SSL certificate auth to the DB servers.)
I don't think you will have any pain using Node.js v6 since it is stable and well-tested and performance continues to be improved. I haven't noticed any significant package compatibility problems. In addition, Node.js v6 will be the next LTS release. If you wanted to play it really "safe" then Node.js v4 would be a reasonable alternative. More details: https://nodejs.org/en/blog/release/v6.0.0/
Make sure you always install packages with the `--save` option to ensure they get installed when you run `npm install` the next time. For example:

```shell
npm install some-awesome-package --save
```

I've not run into issues where deleting `node_modules/` and running `npm install` breaks things. `npm shrinkwrap` is recommended once you have verified everything is working and are preparing to deploy to production, since it will lock down exact versions.

Hope that helps.
It's the first release since it joined the node foundation's incubator program, and a surprisingly long time for a major node project to go without a release. Express has been struggling a bit lately, and hopefully this release will help to assuage our fears.
A co-worker that uses WebStorm said if you follow these steps it will run much faster.
I've been using Sublime mostly. Visual Studio Code runs pretty well on my Linux box, which is where I do my coding.
https://nodebb.org is the best one out there right now... There are a few smaller projects too, but they're not as good. NodeBB is free and open source, with a paid SaaS option as well. Easy-to-use, next-generation forum software.
In fact, it turns out the actual book Clean Code also suggests using polymorphism, and it explicitly warns against using if/else type checking like OP did.
So that raises an important detail to keep in mind... Just because someone on the Internet claims to have adapted Clean Code doesn't mean they adapted it well. We'd probably all be better off if we just read the actual Clean Code book.
/u/fagnerbrack
Regarding multi-threading, I was kind of surprised how easy it was to convert my first serious Node.js application from single-threaded to multi-threaded with the inbuilt Cluster functionality. I was expecting complicated load balancing, resource sharing problems and memory access violations. Instead I saw my app running on 8 separate threads without any changes! The only thing I had to refactor was the socket.io module - I had to introduce a state-sharing mechanism, but it was pretty easy to achieve with Redis. I guess my expectations were kind of biased because my last multi-threaded app was written in C++ and used pure std::thread.
Your link to the Node docs for worker threads is a 404:
> A tiny wrapper for turning Node.js threads in easy-to-use routines for CPU-bound.
I think you probably meant this to be: https://nodejs.org/api/worker_threads.html
> jwt
You are conflating "authentication" and "session/grant" management. Passportjs is designed for "authentication". JWT is designed for "grant" management.
Difference:
If you look at the JWT documentation by Auth0 here, all the use cases come after a successful login ("... when a user successfully logs in ... once a user is successfully logged in ...").
So you still need to figure out how you're going to "authenticate".
What's up with the lazy question posts? 5 seconds to google:
Would have been less effort than posting on reddit.
Depending on your usage, using Buffers instead of typed arrays might be possible. Their APIs are now virtually identical, with a couple of caveats that may or may not matter for your specific usage.
Replacing the `new Uint8Array(1000000)` in your code with `Buffer.alloc(1000000)`:

```javascript
let a = Buffer.alloc(1000000);
const start = process.hrtime();
const runs = 1000;
for (let i = 0; i < runs; i++) {
  a.fill(0);
}
let ms = to_ms(process.hrtime(start));
console.log(ms, "ms,", runs / ms, "op/ms");

function to_ms(hr) {
  return (hr[0] * 1e9 + hr[1]) / 1e6;
}
```

Results in about a 50x performance improvement:

```
37.491576 'ms,' 26.672658412652485 'op/ms'
39.846483 'ms,' 25.09631778543667 'op/ms'
39.137584 'ms,' 25.55088735165666 'op/ms'
```
In Node.js and other CommonJS environments, a file's code is already implicitly wrapped in an IIFE, so doing so yourself is generally unnecessary. Plus, there are ways to get around function scope anyway even if you do.
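You can see the module wrapper's effect directly: a top-level `var` in a CommonJS file does not leak onto the global object, because Node wraps the file in a function (roughly `function (exports, require, module, __filename, __dirname) { ... }`):

```javascript
// In a CommonJS module this is function-scoped by the module wrapper,
// not attached to the global object:
var secret = 42;

// So the manual IIFE you may know from browser <script> files is redundant here:
(function () {
  var alsoScoped = 'unnecessary wrapping';
})();

console.log(typeof global.secret); // "undefined" — it did not leak globally
```

In a browser `<script>` (without modules), that same `var secret` would have become `window.secret`, which is exactly the leak the IIFE pattern was invented to prevent.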
With all of the dogma over threads and event loops, I'm very surprised that no one has mentioned the cluster module.
For anyone familiar with multiprocessing, it's a tad confusing how it duplicates sockets in children as though they are threads, but it's the way to go if you want to saturate all of the cores in the host.
I regularly see confusion in this sub regarding node, async, and how it relates to cores, threads, processes, HTTP servers, and even other platforms. If anyone is interested, I'm considering giving a talk (or two) at Nodevember this year.
>why prevent me from connecting to other sites?
Let's consider the options for what node could do in this situation:
Node chose option 2, and I think this is a reasonable, better, safer option. You aren't being prevented from using RC4 at all; it's just turned off by default. You can easily override the accepted ciphers globally with a command-line argument (as you mentioned), or for an individual connection by passing the `options` object to the corresponding `tls.connect` call.
I could see your argument if node completely did not allow any RC4 use, but that's not the case here. It disables them by default, which is what virtually everyone would want, and for the few people who still regrettably need to use RC4, they can enable it with a connection option or a command-line argument.
Node is not tied to single-threaded apps (though it used to be when it first came out, and when multiple thread support was released it was kinda a "silent" thing, even though it's huge). You can write multi-threaded code, it's just more work in that you have to manage the communication channels between threads.
The docs, and there are multiple libraries that handle much of the management for you.
> I’ve torrented like 50 e-books
I hope they were free e-Books. Here is an Amazon link to purchase a book called Node.js 8 the Right Way which is a great resource in my opinion, and written very recently (programming books tend to go out of date rather quickly). Node.js is up to version 10 now, but 8 is still relevant and has all the new async/await stuff in it.
> I am betting that I can learn it well enough in 5 days to pass a technical interview.
Be warned that companies may ask you what you've built with Node.js, and ask to see your GitHub profile. My recommendation is to actually build something cool and open source it. Then you have something to show, and companies can have a look at your coding style as well.
This library is more or less a thin SDK wrapper around their REST API. The create method you're calling makes a POST to the API endpoint below, using the object you pass in as the request parameters.
From the docs,
> The correct use of 'uncaughtException' is to perform synchronous cleanup of allocated resources (e.g. file descriptors, handles, etc) before shutting down the process.
https://nodejs.org/api/process.html#process_event_uncaughtexception
There's also a ton of warnings about an application entering an undefined state -- I would imagine any async operations you perform there have a possibility to not succeed.
You can run Node on multiple cores, for most scenarios, via child processes or a process manager like PM2.
If your app is small you can easily host both the rest and client server on the same machine. Your react client app is going to use a very small amount of memory and processor resources.
To serve the node app, your best bet is to put nginx in front of it. Nginx will listen on port 80, your node app listens on port 3000, and then you proxy pass everything at 80 to port 3000.
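That proxy setup looks roughly like the following nginx server block (the domain is a placeholder; this is the usual `proxy_pass` pattern, not copied from the tutorial):

```nginx
server {
    listen 80;
    server_name example.com;  # placeholder domain

    location / {
        # Forward everything arriving on port 80 to the Node app on port 3000.
        proxy_pass http://127.0.0.1:3000;
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        # Needed if the app uses WebSockets:
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```

With this in place the Node process never binds a privileged port, and nginx can later take on TLS termination and static file serving as well.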
Instead of using node directly in production, it's better to use a process manager. I prefer pm2 but forever also works well. Either of these will restart your app if it goes down.
Here's a good tutorial on using this setup from Digital Ocean. It works exactly the same way for AWS.
What the fuck is this? Is this some kind of automated testing meta? Both "Kalisha Campbell" and the article writer "Mark Hendersog" with 1 follower are quite obviously fake profiles with photos taken straight from https://thispersondoesnotexist.com. And then the article is the most generic buzz word salad that has literally nothing to do with Twitter.
Boooo. Reported for spam.
Hi, just wanted to thank you for your excellent HTTP Parser: https://github.com/ry/http-parser
I actually use it with LuaJIT and not JS to write asynchronous servers, as Lua's type system and lexical scoping always seemed a bit saner to me.
Docker via Dokku is the only way to go imo. There is even a one-click installer for DO. Shoot me a PM if you need any help.
https://www.digitalocean.com/community/articles/how-to-use-the-digitalocean-dokku-application http://progrium.com/blog/2013/06/19/dokku-the-smallest-paas-implementation-youve-ever-seen/
Dokku is a self hosted mini-heroku.
Your question regarding relation data is the crux of the question, "Should I use Mongo for this project?". The short answer is that if you think your data is relational, then you should run screaming from Mongo and document-based data stores.
I helped build WellnessFX.com and it was the first time I was asked to write a data migration from Mongo to MySQL; eventually it became obvious that our data on that project was relational and we were just bending over backwards in the code to make it do what we needed.
Read this and don't get stuck on the title; she makes an excellent set of points that I have found to be true. http://www.sarahmei.com/blog/2013/11/11/why-you-should-never-use-mongodb/
What I could imagine working well is having a RDBMS for your primary datastore that "publishes" data to Mongo. Your web tier would write to the relational database and read from Mongo. I imagine that this could be pretty high performance excepting the writes (which still would be fast but less fast than a fetch from Mongo). https://stripe.com/blog/announcing-mosql
Mongo has its place but storing relational data is NOT it.
There is no DNS record for the www subdomain. I checked both A and CNAME records from the site below. Without the www it can see the A record just fine.
https://mxtoolbox.com/SuperTool.aspx?action=cname%3awww.mcminnclinic.com&run=toolpage
The reason you would generally use nginx as a reverse proxy is to balance your server load (distributing requests across all your server instances), to increase security (you can easily set up nginx to serve certificates, as discussed in the article you linked, and nginx handles certain types of DDoS attacks better than other web servers), and to get all your logs centralized in one place.
You can read more about the topic Here
To expand on this a little, there are pros and cons for cookie based sessions and trade offs between security and convenience. It's not totally black and white and will likely depend on what you're looking to accomplish. I found this comparison useful: https://www.section.io/engineering-education/cookie-vs-token-authentication/
No, you are not required to use cheerio for web scraping, at all. You can use request() and parse the HTML yourself... but realize this: Node.js is not a browser; it doesn't understand HTML or the DOM. People use cheerio because it makes their life easier when web scraping. It's not a full version of jQuery, just the DOM parsing stuff.
Also, if you are scraping more modern websites, where data comes from APIs instead of being rendered on the server, you might want to look into puppeteer (https://github.com/GoogleChrome/puppeteer). Think of it as a script-able "headless" browser. I have stopped using request+cheerio and moved my scrapers over to it.
Don't focus on learning a technology, focus on learning to build good software and pick up the technologies along the way.
The best developers I've hired aren't great because they walked into the job knowing our tech stack (especially at the junior level). They were great hires because they could learn quickly and apply what they learned.
My suggestion is to sign up for a free account on Heroku (https://www.heroku.com/free) and build something - anything. At every step when you need to make a decision like which database to go with or how to build an API just watch some YouTube videos describing pros/cons and try reading some academic literature on the subject.
Most importantly, pick something and start building. Then once you realize you could have done better or there's some other cool approach you want to try (like serverless), start over again and make it better. The key is to consider your project a part-time job and make sure to put at least one hour into it each day.
There is no better experience than a combination of academic knowledge (why) and practical application (how).
It sounds like it probably installed supporting modules and build tools.
Here's a line from the README for create-react-app:
> If you want to try React without hundreds of transitive build tool dependencies, consider using a single HTML file or an online sandbox instead.
It's actually a great REST API program: you can chain requests, so you can log in and then pass the response to another request.
You can have multiple environments, so you can have one set of routes that will change their target based on what you pick.
At work I have environments for localhost, our development server and production.
Yep, Node even tells us so. If it's stable, you don't need to use a flag. harmony flagged items aren't even "Staged" yet.
> All shipping features, which V8 considers stable, are turned on by default on Node.js and do NOT require any kind of runtime flag.
> Staged features, which are almost-completed features that are not considered stable by the V8 team, require a runtime flag: --es_staging (or its synonym, --harmony).
> In-progress features can be activated individually by their respective harmony flag (e.g. --harmony_destructuring), although this is highly discouraged unless for testing purposes.
node already has a standard library: https://nodejs.org/api/
For module discovery, focused packages are easier to find than all-in-one grab bags: https://github.com/substack/node-mkdirp/issues/17#issuecomment-5863086
Documentation, tests, and contributing are also easier with separate packages, since package versions can iterate independently of one another. With a grab-bag standard library, if one function changes its signature in a breaking way, the entire package should update its major version, even though each function is relatively independent.
From what I've seen in sprawling "batteries included" approaches in other languages, standard libraries are where software goes to die. Bugs calcify into features and you can't ever break an API because versions are completely coupled. Better libraries emerge in the ecosystem, but the standard library has to either ignore them to focus on stability or take in every new idea, breaking frequently and having a meaningless release cycle.
If you want to introduce curation, using the module loader is a terrible idea. There are so many other better ways.
First of all, great work. Keep coding, keep learning! Don't let anyone bring down what you've done here. At the bare minimum, you've learned more node code and made yourself a better developer.
Now, as for feedback, I suggest taking a look at git aliases.
For example, your command:
gg c Update example in README
is written in native git as:
git commit -am "Update example in README"
With git aliases, you can set ci to be short for commit. This is what most people do. For example, the above would simplify to:
git ci -am "Update example in README"
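Setting up such an alias is a one-liner (assumes git is installed; --global writes to your ~/.gitconfig):

```shell
# Define "ci" as an alias for "commit"
git config --global alias.ci commit

# Now these two commands are equivalent:
#   git ci -am "Update example in README"
#   git commit -am "Update example in README"
```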
Either way, good work and keep working on it! Also, check out git extras.
Bests :)
You could try atom again with some packages:
I personally am not using any of these, but it might be stuff like that you're looking for.
I am fairly certain the authors of both the original and final versions of that code have not held a copy of Clean Code, much less read it.
Stop trying to be so goddamned clever and use your brain for something meaningful for once.
In your own words, and without benefit of a whiteboard, what does the function do? Okay, now make it look like that's what it does. if what it does doesn't make sense, then maybe that method isn't worth having. Back up. What are you trying to accomplish? Are there sensible functions that lead to that goal? Okay, do that.
Now, with the remaining untapped mental faculties left over after all that "boring" work, why don't you go solve a real problem? Preferably one you made that you've been bullshitting everyone about how you "don't have time to fix it". Because clearly you do, but you'd rather spend it having fun at everyone else's expense.
(This may or may not be aimed at some coworkers who write like the author)
That's the result of `util.format`, which is the format Node.js uses for logging objects via the console.
https://nodejs.org/docs/latest-v14.x/api/util.html#util_util_format_format_args
It seems everyone is getting errors. Are you open sourcing the code? If so, others might be able to help debug.
Review the new Checkout docs at https://stripe.com/docs/payments/checkout. Checkout changed in April.
The error is thrown from your resulting JS code. compilerOptions > module is where your problem is, I think. Your target is ES5 but you're specifying esnext modules. Set module to commonjs and you should be good to go.
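For reference, a minimal tsconfig.json with that change might look like this (a sketch; your other compiler options will differ):

```json
{
  "compilerOptions": {
    "target": "es5",
    "module": "commonjs"
  }
}
```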
As mentioned, you're only serving the html file right now. When a browser makes a request for the index, that's all they get on the initial page load. After the index.html file is received, it then starts getting parsed by the browser from top to bottom. Whenever it sees that it needs a new resource(like a .css or .js file), it has to make a new request to the server to retrieve it. In this case, the browser is making a request to localhost:3000/ to get the HTML, then the HTML tells the browser to send another request for your CSS at say localhost:3000/css/styles.css, or wherever you're putting it. If you go into the developer console and view the "Network" tab you can see all this happen on page load.
If you don't want to use a framework like express or hapi, you're going to need to create a control flow for your routing to determine which assets to serve. See the http.IncomingMessage class documentation, specifically message.url to see how to access the URL on the request.
It actually has.
Previously, Node.js was able to load ESM only if your file was named with the .mjs extension (.js files were loaded as CommonJS).
In Node 12 you can activate the --experimental-modules flag: .js files will then load as ES modules, and if you still need require() you can name your file with the .cjs extension.
There are some limitations, like __dirname not working in ES modules, and some packages aren't bundled correctly yet, but mostly it works.
https://nodejs.org/api/esm.html
Node clustering mode requires additional code in your main app entry point: https://nodejs.org/api/cluster.html
Multiple execution requires app server (like PM2 or phusion passenger or caprover) with the appropriate configuration turned on.
Which one is better? If you just need to have multiple instances because you're expecting a lot of traffic - multiple execution mode will do the trick - it is simpler to deploy and app server does everything for you. If you want to have full control over what are the responsibilities of each nodejs workers - implement clustering. Clustering gives (in principle) more power. There are a few examples of using clustering node in documentations (like messaging or server share).
If the target path is the jail path or a subpath of it, permit; otherwise deny.
Never try to parse the path itself - there are so many ridiculous ways to get around that.
Just compare the end result to the jail. The easiest approach is path.relative.
Here's the main Node.js source code file: https://github.com/nodejs/node/blob/master/src/node.cc
Line 185 and 186 - you can see some variables there for thread pool sizes. Line 2078 calls "is_main_thread()" - implying there is more than one. Line 3194 creates an actual thread.
Simplifying this a bit, but Node.js works like this: asynchronous operations can spawn new threads. For example, a network socket. These operations run in parallel to the main Node.js thread. When the worker threads need to execute a callback or Promise in the main thread, they post a message to a queue. Node.js' main thread constantly processes that queue and executes JavaScript in the main thread.
Some people even write their own Node.js addons in C or C++ to do multithreading in this manner for performance and scalability reasons.
One of the main components of Node.js is a library called LibUV - a cross-platform multithreading library. It handles a lot of the "magic" described above. Take a look at https://nodejs.org/en/docs/guides/dont-block-the-event-loop/ Under the section "Why should I avoid blocking the Event Loop and the Worker Pool?" they describe the internal threadpool.
I think what most articles on the internet mean to state when they say "Node.js is single-threaded" is that the JavaScript code that runs inside of Node.js runs on a single thread. For most people that's where the largest chunk of their processing occurs, but I've worked on applications where that isn't true. As an example, we've tuned the thread pool sizes to optimize for a larger number of database connections in production environments.
Worker threads are implemented in the latest version of Node (10.5.0) behind the --experimental-worker flag.
I don't know why setting up a SQL database vs Mongo would be really challenging; I've honestly never had trouble installing Postgres. Locally (OSX) I use http://postgresapp.com/ which works like a charm. Plus there's ActiveRecord, since you mentioned using pg gems, and that reduces your interaction with raw SQL to almost nothing.
Apparently it uses this: https://github.com/mafintosh/peerflix which uses node.js
I had no idea you could use node.js to make desktop applications. Can anybody point me in the right direction to learn how to make an app like this?
If you don't need to vary the temperature readings, then ab is a good choice.
ab -n 10000 -c 10000 http://<your-site>/
Apache Benchmark is installed with Apache usually, but you can also get it as a standalone.
One new development I'm very excited about is KoaJS 2.0 which was released a month or two ago.
It's the only framework (that I've seen) that really solves the callback/promise/async confusion in Node, using the new async/await features in Node 7.0. And it happens to be a nice minimalistic framework that doesn't lock you into any specific kind of prescribed structure.
I took a quick look through your code on github and so far I'm impressed. Your code structure and organization is very modular and very easy to read and understand. It's actually one of the most organized source codes I've read for a personal project.
As to your second question, when integrating user roles, the authorization part usually involves having an express middleware check the user request and either allowing/denying that request based on the user's role.
I see you're using jwts! Awesome :) One way to do that is to bake the user/role's permission into the jwt you grant it upon login. And then have an express middleware check the jwt against a list of scope the endpoint has. There's an awesome article on it here https://auth0.com/blog/2014/12/02/using-json-web-tokens-as-api-keys/
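A minimal, framework-agnostic sketch of that middleware idea (requireRole and req.user are hypothetical names; it assumes a verified JWT payload has already been attached to req.user by earlier middleware such as express-jwt):

```javascript
// Returns an Express-style middleware that allows the request through
// only if the authenticated user's role is in the allowed list.
function requireRole(...allowed) {
  return (req, res, next) => {
    const role = req.user && req.user.role;
    if (allowed.includes(role)) return next();
    res.status(403).json({ error: "Forbidden" });
  };
}

// Usage (Express): app.get("/admin", requireRole("admin"), adminHandler);
```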
Hope that pushes you in the right direction.
I run Plunker (http://plnkr.co) on Modulus (http://modulus.io) where zero downtime deployment is a simple 'modulus deploy', I have real-time server stats and I have automatic scaling for the odd traffic surge. Also, every issue I've ever had was quickly responded to by a competent human and resolved.
I strongly recommend Modulus.
Yeah, I see your point. I have about a dozen apps on Heroku on the free tier but none of them use up more than 18 hours.
As I read more about it, I'm seeing how many people this will affect.
I was glad when I heard of the new tiers; I just had to upgrade two of the apps to be on 24/7 and was going to be paying $35/mo each. Now with the hobby tier I only gotta pay $7 each.
Have you looked into OpenShift? https://www.openshift.com/
They only offer 3 free gears/dynos per account, and they won't idle unless they don't get any requests in a 24-hour period.
Yes, you can have Node.js create a child process and run external applications. Although it is a little above my "paygrade," Node.js in Practice by Alex Young and Marc Harter talks about this. From the summary of Chapter 8:
- Use execFile in cases where you just need to execute an external application. It's fast, simple, and safer when dealing with user input.
- Use spawn when you want to do something more with the I/O of the child process, or when you expect the process to have a large amount of output. It provides a nice streamable interface, and is also safer when dealing with user input.
- Use exec when you want to access your shell's facilities (pipes, redirects, globs). Many shells allow running multiple applications in one go. Be careful with user input though, as it's never a good idea to put untrusted input into an exec call.
- Use fork when you want to run a Node module as a separate process. This enables computation and file descriptor handling (like an incoming socket) to be handled off the main Node process.

Here is the child_process API.
You're obviously a beginner. Node.js is for the backend. If you want to do frontend, you have myriads of frameworks; unfortunately, all of them require you to lay out HTML and CSS by hand. If you want to run a simple blog without having to write code, look at https://ghost.org/ or a similar Node.js-powered premade solution.
Otherwise I have to let you down - there is no solution to easily create applications without getting dirty with HTML, CSS, and the DOM. You might try some app like https://jetstrap.com/demo but those will not suffice for creating a web app.
The primary nodejs website has a great article on the event loop
https://nodejs.org/en/docs/guides/event-loop-timers-and-nexttick/
It sounds like you want to learn more about that? I'm not sure about task and microtask queues, I might be missing something. As far as I know everything is processed through the event loop. There is some sequencing based on the type of event (e.g. setInterval vs nextTick). That article covers that info, too.
Could you explain what you mean by "an entire interface"?
Modules are individual scripts inside a package which export something, e.g. a single function, multiple functions, an object, a class. All of these modules will be available to import or require() within the package itself. You can restrict what is exposed by the package when it is installed as a dependency of a project or another package.
The Node.js documentation on Packages might help, especially the section on 'Package entry points': https://nodejs.org/api/packages.html
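As a sketch, the "exports" field in package.json is how a package restricts what consumers can reach (all names here are hypothetical):

```json
{
  "name": "my-package",
  "main": "./index.js",
  "exports": {
    ".": "./index.js",
    "./utils": "./src/utils.js"
  }
}
```

With this, require("my-package/utils") resolves, but reaching into arbitrary files like require("my-package/src/internal.js") is blocked.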
Let me know if I can help clarify anything!