Maximize the ratio of (total work accomplished)/(amount of human+computer resources).
Whichever you're most proficient in is generally going to be most efficient for you, although the general mantra is that interpreted languages like python are faster for development, all else being equal.
Whatever you're best at: clean, maintainable code can be written in either, and horrific garbage can be written in either. Do yourself a favor and pick up some pure software books ("Code Complete", "Clean Code", "Design Patterns"...) and put some effort into writing good code.
If high performance and/or distributed memory parallel programming is essential, use C++/C/Fortran. There are MPI bindings for python, but then you're kind of pushing yourself into a multi-language paradigm which I think is best avoided unless you have a really compelling need. My typical runtimes are on the order of 1000s+ of CPU-hours each running on large clusters. If you're anticipating runtimes of a couple minutes on a desktop, then development time takes precedence.
My personal process has been to prototype small (<1000 lines) codes in matlab, and do the full shebang (my PhD code came out to about 15,000 lines) in C++. I also did a lot of automation tasks (gluing mesh generators, cfd analysis, post-proc, optimization) in python as a glorified shell. I use python for a lot of post-proc and plotting now, and now that I don't have a matlab license it'll take over the quick prototyping role.
It also depends on your objective. If I'm teaching a first course in numerical methods, I'd prefer to have the students use python. If you're trying to write full 3D parallelized cfd codes capable of millions or billions of cells and do this for a living, go C++. But either way learn some of both.
I think people screw up the most in computational fluid dynamics research by forgetting the computational part. It's also the one side you've probably never had much formal training in and you won't get in grad school either. Hopefully your school puts you through some coursework on fluids and numerical methods, but they won't help you turn that into good computing skills.
Learn to automate frequent tasks. I like this xkcd for motivation: https://xkcd.com/1205/ There's so much repetitive stuff that happens in research, why keep repeating things when you can script it? This is everything from simple Unix tools like sed or awk up through automating tasks with python or something. You can save yourself an astronomical amount of time by building up a portfolio of tools right from the beginning.
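To make that concrete, here's the flavor of thing I mean — a tiny Python sketch that scrapes final residuals out of a solver log so you can plot convergence across runs. The log format here is made up (loosely OpenFOAM-style); adapt the regex to whatever your solver actually prints.

```python
import re

def final_residuals(log_text):
    """Pull every 'Final residual = <number>' value out of a solver log.

    The pattern matches OpenFOAM-style residual lines; adjust the regex
    for whatever your own code writes out.
    """
    return [float(m) for m in re.findall(r"Final residual = ([0-9.eE+-]+)", log_text)]

# Hypothetical log snippet for illustration:
log = """smoothSolver:  Solving for Ux, Initial residual = 0.1, Final residual = 0.001, No Iterations 4
smoothSolver:  Solving for p, Initial residual = 0.5, Final residual = 0.0005, No Iterations 12"""

print(final_residuals(log))  # a residual history you can plot or tabulate
```

Once something like this lives in your personal toolbox, checking convergence on a batch of runs is one command instead of an afternoon of scrolling.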
Software quality is important. I don't know your advisors, but I'll bet you the quality of their code is awful. Basically all academic code is. Be a boy scout: leave code better than you found it. Try to learn what "better" actually is. There are lots of classic books out there on software engineering practices, e.g. Code Complete or Clean Code or a bunch of others. Terribly written academic codes have been crushing aspiring students since the transition away from punch cards.
Document stuff as you go. If you struggle with understanding some concept, work it out, and then type it up in a latex document and save it. Bonus points for nice figures. This will help you to learn things in the present, and save you a ton of time in the future.
I really like this one, great breadth and diagrams
The Finite Volume Method in Computational Fluid Dynamics: An Advanced Introduction with OpenFOAM® and Matlab (Fluid Mechanics and Its Applications, 113) https://www.amazon.com/dp/3319168738/ref=cm_sw_r_u_apa_glt_fabc_FXR17YZ8GRWGBB063Y0W?_encoding=UTF8&psc=1
This is a great overview by one of the pioneers.
>Polynomial reconstruction of function at specific points
The basic idea centers around this: you're free to choose which points to reconstruct (or interpolate) from. You can construct a polynomial interpolation from points (i, i+1, i+2), or (i-1, i, i+1), or (i-2, i-1, i); all are valid ways to construct a unique quadratic. What WENO does is, instead of just choosing one of those stencils, it computes all three and uses a weighted average of them. The weights are computed from smoothness indicators that penalize oscillatory stencils (minimizing oscillation, or equivalently maximizing smoothness).
The advantage is that this can get you high order accuracy in the region of discontinuities by automatically biasing the stencil away from it (like upwinding with shocks) but retain stability properties.
The disadvantage is that this gets really expensive in 2D, and insanely expensive in 3D, and it doesn't play very well with unstructured grids compared to other high order methods.
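For the curious, here's a rough sketch of the classic fifth-order WENO (Jiang–Shu) reconstruction at a cell face in Python — the three candidate quadratics, the smoothness indicators, and the nonlinear weights. This is just the standard textbook form, not production code:

```python
import numpy as np

def weno5_face(f):
    """WENO5 (Jiang-Shu) reconstruction of f at the face i+1/2.

    f: the five cell averages [f_{i-2}, f_{i-1}, f_i, f_{i+1}, f_{i+2}].
    Three candidate quadratics are built on the three 3-point stencils,
    then combined with nonlinear weights based on smoothness indicators.
    """
    fm2, fm1, f0, fp1, fp2 = f

    # Candidate reconstructions from each 3-point stencil
    q0 = (2*fm2 - 7*fm1 + 11*f0) / 6.0
    q1 = ( -fm1 + 5*f0  +  2*fp1) / 6.0
    q2 = (2*f0  + 5*fp1 -    fp2) / 6.0

    # Jiang-Shu smoothness indicators (large where a stencil is oscillatory)
    b0 = 13/12*(fm2 - 2*fm1 + f0 )**2 + 0.25*(fm2 - 4*fm1 + 3*f0)**2
    b1 = 13/12*(fm1 - 2*f0  + fp1)**2 + 0.25*(fm1 - fp1)**2
    b2 = 13/12*(f0  - 2*fp1 + fp2)**2 + 0.25*(3*f0 - 4*fp1 + fp2)**2

    # Ideal (linear) weights recover 5th order on smooth data
    d = np.array([0.1, 0.6, 0.3])
    eps = 1e-6
    alpha = d / (eps + np.array([b0, b1, b2]))**2
    w = alpha / alpha.sum()
    return w[0]*q0 + w[1]*q1 + w[2]*q2

# On smooth (here, linear) data all three stencils agree and
# the face value is exact:
print(weno5_face([0.0, 1.0, 2.0, 3.0, 4.0]))  # ~2.5, the midpoint value
```

Near a discontinuity one of the β's blows up, its weight collapses toward zero, and the average automatically leans on the smooth stencils — which is the whole point.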
Two points:

> unless the problem you're simulating is fairly simple.
Solving a 1-D Burgers' equation is fairly simple.
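For illustration, a minimal conservative 1-D inviscid Burgers solver with a Godunov flux really is just a few lines (this is a sketch with assumed grid and CFL values, periodic boundaries):

```python
import numpy as np

def burgers_step(u, dt, dx):
    """One conservative update of inviscid Burgers u_t + (u^2/2)_x = 0
    with the exact Godunov flux for a convex flux, periodic boundaries."""
    ul = u
    ur = np.roll(u, -1)
    fl, fr = 0.5*ul**2, 0.5*ur**2
    # Shock (ul > ur): take the larger flux; rarefaction: the smaller
    flux = np.where(ul > ur, np.maximum(fl, fr), np.minimum(fl, fr))
    # Transonic rarefaction (ul < 0 < ur): flux of the sonic point u = 0
    flux = np.where((ul < 0) & (ur > 0), 0.0, flux)
    return u - dt/dx * (flux - np.roll(flux, 1))

# Sine wave steepening into a shock on a periodic domain
n = 200
x = np.linspace(0, 2*np.pi, n, endpoint=False)
dx = x[1] - x[0]
u = np.sin(x)
for _ in range(300):
    u = burgers_step(u, 0.5*dx, dx)  # CFL = 0.5 since max|u| = 1
```

Being conservative, the scheme preserves the total of u exactly, and being monotone it creates no new extrema — both easy things to check while you learn.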
> Matlab is horribly inefficient
Written improperly, it is. With proper use of vectorization it is typically as fast as, or only several times slower than, C. Take a look at the table with run-time comparisons of languages on the Julia webpage^[1]. The problem is that most people get exposed to Matlab for a semester and make glorified calculators in it with for-loops galore. Properly vectorized, your expensive code is basically just optimized BLAS calls.
Short of production-grade implementations, I think Matlab is a very good choice, as it is the appropriate domain-specific language and permits very concise code: I can write a 1-D heat conduction solver in one line of Matlab. My entire Master's thesis on a high-order discontinuous Galerkin solver for vortex methods works out to ~150 lines. It ran this sixth-order simulation with ~170k DOFs in 2 hours on one core of an Intel i7-3770.
_

[1] Note: their parse_int and rand_mat_stat implementations are slower than they should be because they didn't properly vectorize; I'll be submitting a pull request with vectorized versions that match the compiled languages' performance. The idiomatic recursive Fibonacci implementation they used will be slow because there is a larger overhead for function calls; obviously a recursive approach is the wrong one there, but it is a fair comparison nonetheless.
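To show what "properly vectorized" means in practice, here's the 1-D heat conduction example in NumPy (same idea as the Matlab one-liner mentioned above; the whole interior updates in one array expression instead of a scalar loop — the grid size and run length here are arbitrary):

```python
import numpy as np

# Explicit FTCS update for 1-D heat conduction u_t = alpha * u_xx.
# The vectorized interior update is the NumPy analogue of the
# Matlab one-liner: no explicit loop over grid points.
n, alpha = 51, 1.0
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
dt = 0.4 * dx**2 / alpha          # below the FTCS stability limit of 0.5
u = np.zeros(n)
u[0], u[-1] = 0.0, 1.0            # fixed-temperature ends

for _ in range(20000):
    u[1:-1] += alpha*dt/dx**2 * (u[2:] - 2*u[1:-1] + u[:-2])

# Steady state is the linear profile u(x) = x
print(np.max(np.abs(u - x)))
```

The loop over time steps remains, but the expensive spatial work is a handful of array operations, which is exactly where interpreted languages stop being slow.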
I think COMSOL would probably be better for this since if you're also dealing with an ionic solution and the electric field in addition to fluid flow, you're going to need to solve the Poisson-Boltzmann equation with the Navier-Stokes equation (really just Stokes flow at this length scale, but still).
Also, at this length scale your Debye length is not going to be small compared to your geometry, so you can't linearize the Poisson-Boltzmann equation either; you'll have to solve the full nonlinear equation.
I know that COMSOL can model all of that, given you have the necessary modules. I doubt that FLUENT has that; however, I dropped FLUENT a couple of years ago, and they may have added more physics since then, though I doubt it.
Within COMSOL, which module you would need is a little more difficult to answer. COMSOL has lots of different modules and they overlap a lot, but they do have a guide that can help you figure out what you need.
Beyond that, you're probably better off contacting a COMSOL representative.
> An AMD Phenom II x6 1100T will run you $200, and outperform the i5-2500. If you need absolute performance at all costs, the Core i7s are unbeatable, but Intel chips are a pretty poor performance-per-dollar value.
This is wildly inaccurate, especially if your application is CFD. I built a 15 node, 60 core cluster this summer, and I researched and tested equipment obsessively. Can AMD cost slightly less per node? Yes, but that is the wrong question. You should be asking how many iterations per second you can buy for your money. So much more goes into this number than simply number of cores times clockspeed. Specifically, memory bandwidth and cache performance play a huge role, both of which are areas where AMD lags significantly.
Mind you, it is only this clear cut for traditional CFD on unstructured meshes. If you are running on a structured mesh, or using a Lattice Boltzmann code, or anything else where the algorithm can be more local, then AMD can compete. In these cases you will not be memory bandwidth limited.
The benchmarks that I ran were on 30-50 million cell unstructured meshes with OpenFOAM and Star-CCM+. I also looked at hardware review sites, which often use the Euler3d benchmark. It runs a traditional CFD algorithm on a tet mesh. I found Euler3d to be pretty representative of the results I would get on my own.
> Now I kind of want to make a spreadsheet.
Here, just use the one I made this summer. Both AMD and Intel have come out with new chips with updated architecture since I made this, so it would need to be updated.
I think the "recommended" method is to use ANSYS Remote Solver Manager, which will run a server through which you can remotely submit/monitor simulation runs. Installing RSM requires root access unless you are really creative, so this may not be an option.
Alternatively, you can use screen (http://www.gnu.org/software/screen/). It will create a login session that stays resident in the background and you can "re-connect" to the session when you log back in. You should be able to compile and install this as a non-privileged user if necessary.
freeCAD is a good open-source CAD modeling package.
Fusion360 is free for students, personal use, startups and small businesses under $100k revenue.
http://www.comsol.com/support/knowledgebase/816/
This is all you need
1. Do a General Form PDE.
2. Set the dependent variables to y and v.
3. Set gamma to vx and yx.
4. Set the source term to 0 and v.
5. Make sure da is all zeros.
6. Dirichlet BC at x=1: y=-1.
7. Dirichlet BC at x=2: y=0.
8. Flux/source BC at x=1: set g=0 for y and g=-1 for v.

and the numerical and exact solutions should overlap.
He's right about Gmsh only writing unstructured grids.
From the documentation:

> All the meshes produced by Gmsh are considered as “unstructured”, even if they were generated in a “structured” way (e.g., by extrusion). This implies that the mesh elements are completely defined simply by an ordered list of their nodes, and that no predefined ordering relation is assumed between any two elements.
If you have access to MATLAB, you can use the FEATool Multiphysics GUI to easily accomplish this, as it comes with built-in support for 4-series NACA wing profiles, automatic mesh generation, and FEniCS solver integration (you can use FEATool to export FEniCS Python scripts and call the solver, or just export the mesh as Dolfin/FEniCS XML grid data if you prefer).
Yep. You can use a Delaunay filter to wrap the point cloud, and then you can go nuts on it with contour plots. You can do this in the GUI or just script it with Python for your own batch post-processing.
According to this you can do it: http://www.paraview.org/Wiki/ParaView/Vector_Graphics_Export
I just tried it on PV 4.1 and it seemed to just make a raster EPS (checked it in Inkscape). I'd say try it anyway; I might have been setting it up wrong (export or import).
I think the XML format has more features, supporting for example data compression. Another advantage is that PVD files, which can contain time information for your VTK files, only support the XML formats.
On the other hand, the legacy format is much easier to generate, so I'd recommend it if you don't need the extra features.
How would I get started with writing a PDE solver to run on GPUs? I've written quite a few codes from scratch (both for courses and research as a grad student working on numerical analysis/methods development) and have some experience with both shared-memory (OpenMP) and distributed-memory programming (MPI), but don't really know where to begin with GPUs.
I would probably be using an on-demand service like https://www.linode.com/products/gpu/, and would be interested in either using C++ or perhaps Python (with https://mathema.tician.de/software/pycuda/ or something like that). This would be mostly for my own learning purposes, and potentially as a bit of a side project for my PhD. Interested to hear what people would recommend regarding resources for learning how to use GPUs in scientific computing.
I think before you start investing in hardware you need to consider what you will need to fully resolve the flow. If the 100K elements does it, great, but it sounds like that is a simplified version you would like to add detail to. You can quickly outgrow a $1k machine.
Also, consider that the cloud (HPC) will probably cost you around $0.15 per core-hour. Those cores are also generally not that fast, and you would need to use MPI.
The story is a bit different for virtual machines in the cloud, something like Amazon's EC2. https://aws.amazon.com/ec2/pricing/
If it's a one-off, rare simulation, you might be best off creating a t2.micro instance, which is free. Set up the environment there and get it ready. Then you can upgrade the instance to something large, like a c3.8xlarge, which has 32 cores and 60 GB RAM for $3/hour. A naive core-count comparison with your laptop would have it done in under 2 hours — though I'm sure the machine is faster than a mobile CPU anyway. In theory you could get it done for <$10 and have a lot of memory available.
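The arithmetic behind that estimate, with the assumed numbers spelled out (your laptop's actual core count and runtime will differ):

```python
# Back-of-envelope cloud cost estimate. All numbers are illustrative
# assumptions: a 4-core laptop run taking 16 hours, moved to a
# 32-core instance billed at $3/hour.
laptop_cores, laptop_hours = 4, 16
core_hours = laptop_cores * laptop_hours      # 64 core-hours of work

instance_cores, rate = 32, 3.0                # cores, $/hour
hours_on_instance = core_hours / instance_cores
cost = hours_on_instance * rate               # 2 hours, $6 under these assumptions
```

Naive scaling like this ignores parallel efficiency and per-core speed differences, so treat it as an upper-level sanity check, not a quote.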
A very good book to start would be this:
https://www.amazon.com/Computational-Mechanics-Transfer-Physical-Processes/dp/1591690374
If you’re modeling the atmospheric boundary layer assuming neutral stability, then you should be using a logarithmic velocity inlet profile that accounts for surface roughness. You also need to modify the turbulence boundary conditions as well. I would highly recommend this thesis by Katsanis; it should be very helpful and directly discusses how to deal with your problem: Numerical Modeling of Wind Borne Pollution Dispersion. Hope this helps.
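For reference, the standard neutral log-law inlet (Richards–Hoxey style, widely used with k-epsilon models) looks like this in Python; the roughness length and reference wind speed below are placeholder values you'd replace with your site data:

```python
import numpy as np

kappa, C_mu = 0.41, 0.09      # von Karman constant, standard k-epsilon constant
z0 = 0.03                     # aerodynamic roughness length [m] (assumed terrain)
u_ref, z_ref = 10.0, 10.0     # assumed reference wind speed [m/s] at height [m]

# Friction velocity chosen so the profile hits u_ref at z_ref
u_star = kappa * u_ref / np.log((z_ref + z0) / z0)

def abl_inlet(z):
    """Neutral log-law velocity with matching k and epsilon profiles."""
    u = u_star / kappa * np.log((z + z0) / z0)
    k = u_star**2 / np.sqrt(C_mu)
    eps = u_star**3 / (kappa * (z + z0))
    return u, k, eps

u, k, eps = abl_inlet(np.array([0.0, 2.0, 10.0]))
```

Feeding consistent u, k, and epsilon at the inlet is the part people most often skip, and it's why the profile "decays" downstream in so many ABL runs.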
It is a project. I looked into student licensing, but the CFD packages are still above the $500 mark. Accuracy will eventually be important, but a good guesstimate will work for now.
The geometry is a sort of tessellated 3D shape similar to these: https://www.shutterstock.com/image-illustration/3d-mixed-geometrical-complex-faceted-shapes-199821002.
On Android there is an app called Windtunnel (https://play.google.com/store/apps/details?id=com.algorizk.windtunnellite). I think this is exactly what you're looking for. You can draw whatever shape you want and it will quickly give you a visualization of the flow around it. The free version only has predefined shapes, though, which you can play around with. Give it a try and get the full version if it's what you were looking for.
Perhaps "A Voyage Through Turbulence"?
>Turbulence is widely recognized as one of the outstanding problems of the physical sciences, but it still remains only partially understood despite having attracted the sustained efforts of many leading scientists for well over a century. In A Voyage Through Turbulence we are transported through a crucial period of the history of the subject via biographies of twelve of its great personalities, starting with Osborne Reynolds and his pioneering work of the 1880s. This book will provide absorbing reading for every scientist, mathematician and engineer interested in the history and culture of turbulence, as background to the intense challenges that this universal phenomenon still presents.
Pretty straightforward way to compress your video files if you're happy with everything else and just want them smaller: HandBrake.
It's available for every platform under the sun, really, and great at re-encoding and compressing video. /u/Overunderrated mentioned it buried in their reply, so I'm really just leaving this here in case you missed their post.
Yes, but conditionally. The drag coefficient has the drag force in the numerator and velocity squared in the denominator. As velocity increases, the two grow at a similar rate, so the drag coefficient remains "fairly" constant. This is not valid for very high or very low velocities, though. It is also why you find constant values of Cd for most geometries in textbooks and the literature (although they do mention the Reynolds number regime for applicability), similar to what is shown in the picture in your post. For instance, experimentally for an SUV, the drag coefficient is generally constant with speed above 25 mph link
The same is also observed for the generic Ahmed body. Check one of the comments here that shows Cd vs Re. (For a car with fixed geometry, Re is only a function of velocity, so it is practically Cd vs Re.)
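Spelling out the definition with some made-up SUV-ish numbers: Cd = F / (0.5 ρ V² A). If the drag force scales with V², as it roughly does in the constant-Cd regime, the coefficient doesn't move:

```python
def drag_coefficient(F_drag, rho, V, A):
    """Cd = F / (0.5 * rho * V^2 * A): drag force normalized by
    dynamic pressure times reference area."""
    return F_drag / (0.5 * rho * V**2 * A)

# Illustrative (assumed, not measured) numbers:
# 300 N of drag at 30 m/s in sea-level air, 2.5 m^2 frontal area
cd = drag_coefficient(300.0, 1.225, 30.0, 2.5)

# Double the speed, quadruple the force -> same Cd
cd2 = drag_coefficient(4 * 300.0, 1.225, 60.0, 2.5)
```

When force stops growing exactly with V² (very low Re, or compressibility at the high end), Cd picks up its Reynolds/Mach number dependence.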
it was the vertical windfarm...
https://sites.google.com/site/verticalwindfarm
but you mentioned that it would be difficult to calculate the forces...
but I was thinking CFD would be able to determine if the downwash from the front side of the loop of wings could add to the lift of the back side of the loop... also I think I mentioned a video that shows Kilimanjaro causing the hurricanes off the Atlantic...
Salome is distributed without the meca bit from http://www.salome-platform.org/, while salome-meca is developed on top of this platform by the Code_Aster guys. I guess you'd use Salome on its own if you only wanted to use Code_Saturne.
For CAD: use DesignSpark Mechanical https://www.rs-online.com/designspark/mechanical-software. This is the original ANSYS SpaceClaim and is free. You can export STL from DesignSpark and use OpenFOAM for CFD and ParaView for post-processing.
I would recommend the classic DFG cylinder benchmarks where the drag and lift coefficients have been computed up to machine precision. Model set up https://www.featool.com/model-showcase/04_fluid_dynamics_03_flow_around_cylinder1 and references:
[1] John V, Matthies G. Higher-order finite element discretizations in a benchmark problem for incompressible flows. International Journal for Numerical Methods in Fluids 2001.
[2] Nabh G. On higher order methods for the stationary incompressible Navier-Stokes equations. PhD Thesis, Universitaet Heidelberg, 1998. Flow Around a Cylinder
You can parameterise inputs: basically set inlet velocity as a parameterised input, drag a response-surface block into your model in Workbench, and select what's required. That's the easiest option.
http://www.ansys.com/Products/Workflow+Technology/ANSYS+Workbench+Platform/ANSYS+DesignXplorer
Yeah, it is fine (below 0.9). If you would like to know more about the Meshing app, ANSYS provides tutorials for it. You can find them on the ANSYS customer portal, which is for customers with paid support. Also, be sure that you do some sort of grid independence study.
I don't really understand what you're trying to do... my guess is that you either want to use periodic boundary conditions, as /u/chaosbutters has mentioned or you want extrusion operators (discussed here: http://www.comsol.com/blogs/using-general-extrusion-coupling-operator-comsol-dynamic-probe/)
That's all I can give you without more information.
What type of computer are you running on if it takes 8 hours? This is a pretty standard example, so I think you're setting something up seriously wrong. Go look up this example.
On http://www.paraview.org/download/ there is a drop-down list under the "Nightly build".
In this list I can see a lot of different things, including a lot of "ParaView-x.x.x-xxx-xxxxxxxx-Qt4-Windows-xxbits.exe".
Aren't those the binary executables for ParaView on Windows?
Try looking at the ParaView guide, since the material in the OpenFOAM user guide relates to an older version of ParaView.
I hope this is the right place for the guide:
http://www.paraview.org/documentation/
> I have come across only one such simple case that involves flow between two concentric cylinders.
Taylor-Couette flow is the simpler/original version of your link.
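Taylor–Couette also has a clean closed-form solution you can verify against: the steady azimuthal velocity is u_θ(r) = A r + B/r, with A and B fixed by no-slip at the two walls. A quick sketch (radii and rotation rates here are arbitrary illustration values):

```python
def taylor_couette_utheta(r, R1, R2, omega1, omega2):
    """Exact steady azimuthal velocity u_theta(r) = A*r + B/r between
    concentric cylinders of radii R1 < R2 rotating at omega1, omega2."""
    A = (omega2*R2**2 - omega1*R1**2) / (R2**2 - R1**2)
    B = (omega1 - omega2) * R1**2 * R2**2 / (R2**2 - R1**2)
    return A*r + B/r

# No-slip check at both walls for some assumed radii/rates:
R1, R2, w1, w2 = 1.0, 2.0, 1.5, 0.0
print(taylor_couette_utheta(R1, R1, R2, w1, w2))  # -> w1*R1 = 1.5
print(taylor_couette_utheta(R2, R1, R2, w1, w2))  # -> w2*R2 = 0.0
```

Comparing a solver's profile against this curve (before the Taylor instability kicks in) is about the cheapest verification case there is for rotating flows.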
Are you looking for something with exact solutions? Not sure if the inviscid vortex transport counts for what you're looking for.
I used this 2D solution to verify a solver I wrote: https://sites.google.com/site/randymcdermott/NS_exact_soln.pdf
This paper has what appears to be a similar solution in 3D: http://dx.doi.org/10.1002/fld.1650190502
I checked that the error was at the correct order of accuracy. It took a while to work out a few bugs but eventually my solver was spot on.
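If it helps anyone else doing this kind of verification, the classic 2-D Taylor–Green vortex (closely related to the solution linked above) is easy to code up as a reference field:

```python
import numpy as np

def taylor_green(x, y, t, nu=0.1):
    """Classic 2-D Taylor-Green vortex: an exact decaying solution of the
    incompressible Navier-Stokes equations on a periodic box."""
    decay = np.exp(-2.0 * nu * t)
    u = -np.cos(x) * np.sin(y) * decay
    v =  np.sin(x) * np.cos(y) * decay
    p = -0.25 * (np.cos(2*x) + np.cos(2*y)) * decay**2
    return u, v, p

# Divergence check on a periodic grid via central differences
n = 64
h = 2*np.pi / n
xs = np.arange(n) * h
X, Y = np.meshgrid(xs, xs, indexing="ij")
u, v, _ = taylor_green(X, Y, t=0.3)
div = (np.roll(u, -1, 0) - np.roll(u, 1, 0)) / (2*h) \
    + (np.roll(v, -1, 1) - np.roll(v, 1, 1)) / (2*h)
print(np.max(np.abs(div)))  # analytically zero; cancels to machine precision here
```

Sampling this field at your grid points and cell faces gives you exact data to difference against, which is exactly how you confirm the observed order of accuracy.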
I just found Unicorn, which seems to fit the bill. It hasn't had an official update in a while, but I spoke with one of the developers, who tells me there is very active development right now; it just hasn't been released officially.
This, and also managing large (100+ page) documents is so much simpler in LaTeX. Making changes to images/charts is better, and they behave better in text. The only thing that isn't really neat/intuitive is generating large tables.
Would recommend TeXMaker as an editor: http://www.xm1math.net/texmaker/
I scribbled out some notes here along with plots of computed errors for forward euler and analytical error bounds for your problem (last figure). First half I was just recreating what is in LeVeque's book, linked there.
I see where your confusion is -- google is failing you because you're not looking for the right thing.
Edit: sorry, I didn't label my axes. The last plot has error on the y axis and number of steps (1 over step size) on the x axis.
tl;dr: there is no stability limit for your problem.
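For anyone following along without the notebook, the forward Euler convergence behavior is easy to reproduce on a model problem like y' = -y (not necessarily OP's exact equation):

```python
import math

def euler_error(n_steps, T=1.0):
    """Global error of forward Euler on y' = -y, y(0) = 1, measured at t = T."""
    h = T / n_steps
    y = 1.0
    for _ in range(n_steps):
        y += h * (-y)
    return abs(y - math.exp(-T))

e1, e2 = euler_error(100), euler_error(200)
print(e1 / e2)  # halving the step roughly halves the error: first order
```

Plotting euler_error against the number of steps on log-log axes gives exactly the slope-one line in the last figure.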
I'll see if I can find some help for you.
This book is great for dimensioning and designing cyclones, you may find some good similarity in the calculation.
Before going wild on simulation I advise you to check out the physics behind it.
I am increasingly doing work in reduced order modelling, and that area overlaps with machine learning a fair amount depending on how you define the two. For reference, I'm an applications researcher, not a code developer. From my perspective, ML is a tool I'm using to analyse my data and/or to build ROMs of fluid phenomena. CFD is used to generate input data to tune/train the model, but is then subsequently replaced by the ROM. I don't really need the CFD and ROM/ML to interact; it's more like a post-processing step. Off the top of my head, I am not familiar with any areas where tight interaction of the CFD code and ML or ROM methods would be useful. CFD and ROM/ML are just different tools in my fluid mechanics toolbox.
I think a good starting point is this textbook by Brunton and Kutz
Computational Fluid Mechanics and Heat Transfer 2nd or 3rd Edition by John C. Tannehill, Dale A. Anderson, and Richard H. Pletcher Amazon Link
I've read through some of this book, and it's been recommended to me by my mentor. It's a great book to build a strong foundation in CFD
> Hoerner (lift)
This?
https://www.amazon.com/Fluid-Dynamic-Lift-Information-Aerodynamic-Hydrodynamic/dp/9998831636
>Check out Hirsch's "numerical computation of internal and external flows vol 1", it even walks through that actual example of transient flow over a cylinder.
Cheers, I'll try and get a copy.
Just wondering, do you know if this edition ("Numerical Computation of Internal and External Flows: The Fundamentals of Computational Fluid Dynamics: Vol 1") is an updated version of what you recommended (this, i.e. "Numerical computation of internal and external flows vol 1: fundamentals of numerical discretisation")? The apparent change of the subtitle has me a bit worried that they don't have the same content.
>I'd say for a starting point, instead of worrying immediately about transient flow over a cylinder, first mock up a solver for a laplacian over that same geometry so you can worry about your data structures and gradients and boundary conditions first, without the added complexity of NS.
Thanks for the suggestion, I'll do that (I would have probably tried to plough ahead with NS otherwise and ran into trouble).
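In case it helps, the Laplace warm-up can be as small as this — a Jacobi sweep on a Cartesian grid with Dirichlet data whose harmonic extension is known (the cylinder geometry and your own data structures come later; this just shows the skeleton):

```python
import numpy as np

# Jacobi iteration for Laplace's equation on the unit square,
# with Dirichlet data u = x on the whole boundary. The harmonic
# extension of that data is simply u = x, so we can check the answer.
n = 33
x = np.linspace(0, 1, n)
X, Y = np.meshgrid(x, x, indexing="ij")
u = X.copy()              # boundary values are correct
u[1:-1, 1:-1] = 0.0       # interior starts from a bad guess

for _ in range(5000):
    u[1:-1, 1:-1] = 0.25 * (u[2:, 1:-1] + u[:-2, 1:-1] +
                            u[1:-1, 2:] + u[1:-1, :-2])

print(np.max(np.abs(u - X)))  # should be near zero
```

Swapping the structured grid for your unstructured data layout is precisely where you hit the design questions (connectivity, gradients, BCs) worth sorting out before adding Navier-Stokes.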
https://www.amazon.co.uk/Introduction-Computational-Fluid-Dynamics-Finite/dp/0131274988
This book is great for starting out. There are others more suited to aerodynamics but that book is a good starting point.
If you are interested in vortex methods, this is a decent book.
https://www.amazon.com/Vortex-Methods-Practice-Georges-Henri-Cottet/dp/0521621860
I briefly worked with vortex methods, and I actually had to cycle between papers, theses, and this book to get it working. It's actually a nice method for incompressible flows, but I could never get boundary conditions for walls to work. In, say, a periodic domain it works well, though.
OP, if you're looking for a basic CFD book to start (other than conceptual materials for Fluid Mechanics, Heat Transfer, Thermodynamics) I strongly recommend <em>Numerical Heat Transfer and Fluid Flow</em> by Patankar. It's a bit pricy for such a small book but it's worth every penny, especially if you're looking to write your own basic code.
For high-speed impact problems the water remains effectively incompressible, but if there are trapped air pockets, those can develop pressures high enough that compressibility matters, which leads to a difficult multiscale problem. Most of what I know about slamming problems comes from Faltinsen, as well as some work that colleagues were doing at my last employer. I don't know much about Fluent's models for these kinds of problems; I've never used it.
Buy or check out this book from a library. That will be your go-to for all the CFD aspects (i.e., the numerical simulation of a physical problem). Other resources you'll find will be more strictly programming-based (things like MPI, OpenMP, Fortran, etc.).
If you want a book to purchase that will give you a fantastic introduction to computational aerodynamics, let me suggest this one written by some colleagues of mine: Applied Computational Aerodynamics
If you are interested in a book on numerical solution of PDE, you should check this one out. This is the book I learned out of, and it is particularly suited to finite difference methods (but covers some of FVM and FEM and has a chapter on linear solvers).
This is a good introduction to finite volume methods.
As for books about CFD, I am not too sure; my research is on numerical solution of hyperbolic PDEs (think Euler equations), and I work with a particular method (discontinuous Galerkin). But these two books should be accessible to you with your background. Read them, try to code things up. See what happens when you break stability conditions; see what happens when you do not limit a shock. These things will help you understand and interpret CFD results.
I would start with pen and paper. How comfortable are you deriving the weak form for differential equations?
Personally, I would start with that. Then check out software that will permit you to discretize the equations (such as FEniCS,etc. ).
A good text to get you going would be Oden, Becker, Carey (http://www.amazon.com/Finite-Elements-An-Introduction-Volume/dp/0133170578)
Yeah, baby steps indeed! None of us learned this stuff overnight. I think a decent number of people in this subreddit have devoted at least a year (if not more) of advanced undergrad or beginner graduate level coursework to really master the principles behind CFD solvers.
I cannot recommend Moin's <em>Fundamentals of Engineering Numerical Analysis</em> enough. It starts with the very basics (numerical integration and finite differences) and then builds all the ODE/PDE discretization techniques right on top of these basics. If you're going to be learning about PDEs this semester, and then linear algebra next, this numerical methods book from Moin will be a great companion.
By the end of your linear algebra class, you should be able to write a program that discretizes an ODE with central differencing, which creates a linear matrix system (Ax=b), which in turn is solved using some iterative method (e.g. Gauss-Seidel). That's not exactly how CFD solvers work (they use different discretization methods), but the general workflow is identical. Doing an exercise of that sort would be very helpful, and the general framework of your code can later be upscaled to more advanced discretization techniques.
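That whole workflow fits in a few lines of Python, if it helps to see it end to end — central differencing of -u'' = f with a manufactured solution, solved by Gauss-Seidel sweeps (grid size and sweep count here are arbitrary choices):

```python
import numpy as np

# Discretize -u'' = f on (0,1) with u(0) = u(1) = 0 using central
# differences, then solve the resulting tridiagonal system with
# Gauss-Seidel. Manufactured exact solution: u = sin(pi x).
n = 21
x = np.linspace(0, 1, n)
h = x[1] - x[0]
f = np.pi**2 * np.sin(np.pi * x)       # chosen so that -u'' = f

u = np.zeros(n)
for _ in range(5000):                  # Gauss-Seidel sweeps
    for i in range(1, n - 1):
        # From (-u[i-1] + 2u[i] - u[i+1]) / h^2 = f[i]
        u[i] = 0.5 * (u[i-1] + u[i+1] + h**2 * f[i])

print(np.max(np.abs(u - np.sin(np.pi * x))))  # O(h^2) discretization error
```

Refining the grid and watching the error drop by 4x each time you halve h is the same convergence study you'll later run on full CFD solvers.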
Thanks for the reply. I already have had an undergraduate course in fluids, which used Introductory Fluid Mechanics - Katz. I didn't like this book all that much because it was not very mathematical, but it gave a decent understanding of some of the simpler concepts.
I had an opportunity to take a graduate fluids course through my applied math department but chose to take other courses instead (asymptotic analysis/perturbation theory, and applied functional analysis). I am thinking that next September when it is offered, I will take it even though I will have completed my course requirements.
I used Katz for my introductory fluids course. I thought it was good book, not as mathematical as I wanted (I'm in applied math) but had a lot of useful applications. The book starts with the derivation of the NS equations then starts off with the easiest simplifications, fluid statics. Then the next chapters relax an assumption, making the model a little more difficult but a little more practical. I think this book might be a good fit for you.
You need to have a good grasp on both the physics of fluids as well as a grasp of numerical methods and how they behave (and introduce errors into the solution by means of approximation).
This is the book I learned from, with a hefty amount of supplemental instruction and then practical research and formal classes. I do not recommend learning just from a book, without guidance (at least, if you want to be good at it).
It doesn't necessarily pinpoint specific aspects of fluid dynamics, but it goes over the different PDE's you'll find in all aspects of physics computation, with diffusive PDE's, as well as "convective" type PDE's. This describes fluids pretty well (as the Navier Stokes equations are convective-diffusion partial differential equations).
What's your background? How much fluids do you know? What year are you?