I suggest using Python over C for quick maths like you showed in the screenshot, since you don't have to compile and you have access to an interactive interpreter. You can also take a look at Jupyter Notebook, which lets you put code directly in notes.
If you've already set up Anaconda then you should be ready to go with a Jupyter notebook (https://jupyter.org/). He can write the code and execute it with the results being displayed below each cell. The command (jupyter notebook) is simple to remember and the notebook functions the same on every device (at least in my experience).
As he progresses I think you're on the right track looking at a text-editor and then moving to PyCharm :)
Actually
> [Jupyter] has support for over 40 programming languages, including those popular in Data Science such as Python, R, Julia and Scala.
Haha, how can anyone not?! :)
For stuff like this though, I only really use Excel (or LibreOffice) to prettify the output. All the number crunching and everything else is done in Jupyter notebooks.
If you are familiar with Markdown, I recommend using an Atom package called Markdown Preview Enhanced. You can include code snippets and run them, and you can also generate a table of contents with it. Plenty of neat features, check it out!
Another popular option is Jupyter. I haven't gotten a chance to use it myself, but I've heard it's very useful once you have everything set up.
Hey, my requirements are pretty similar to yours.
If their engineers have experience with docker and AWS ECS or AWS EC2, that is preferred. I would ask for the following applications containerized through docker:
A container for a jupyter server (the server running your notebooks)
A container which starts a variety of timed cron jobs and other data-retrieval jobs. I recommend Celery for retrieving data, internal processing of data, and scheduled tasks in general; it could even replace cron (though the two can be used together if that's desired). A minimal sketch follows after this list.
A mongodb server (AWS provides their own mongodb cloudformation, however we went with our own container running mongodb)
Documentation of how to run a backup of mongodb, your jupyter server, as well as the git repo.
Documentation of how to deploy updates, along with a recommended workflow for doing so.
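To make the Celery recommendation above concrete, here is a minimal sketch of a scheduled task. This assumes Celery 4+ with a Redis broker; the broker URL, task body, and schedule are placeholders, not part of the original setup:

    from celery import Celery

    # Hypothetical broker URL; swap in whatever broker your containers expose.
    app = Celery('jobs', broker='redis://localhost:6379/0')

    @app.task
    def fetch_data():
        # Placeholder for the data-retrieval work described above.
        pass

    # Run every hour via celery beat (this is what can replace or complement cron).
    app.conf.beat_schedule = {
        'fetch-data-hourly': {'task': 'jobs.fetch_data', 'schedule': 3600.0},
    }

The jobs container would then run a worker plus the beat scheduler (celery -A jobs worker and celery -A jobs beat).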
As for how much this would cost? A bare-minimum setup would run at least a few thousand dollars; it's tough to estimate without more details. Then you'll need continued support, which will cost more. Of course, this is fairly basic stuff, so a competent IT team should be able to set you up with it if it's made a priority.
> Click around site, "This is a thing that does things!"
Really?
>
> IPython provides a rich architecture for interactive computing with:
> * A powerful interactive shell.
> * A kernel for Jupyter.
> * Support for interactive data visualization and use of GUI toolkits.
> * Flexible, embeddable interpreters to load into your own projects.
> * Easy to use, high performance tools for parallel computing.
> The language-agnostic parts of IPython
>The Jupyter Notebook is an open-source web application that allows you to create and share documents that contain live code, equations, visualizations and narrative text. Uses include: data cleaning and transformation, numerical simulation, statistical modeling, data visualization, machine learning, and much more.
> "Programming should be more like Excel"
No, Excel should be more like programming. Example? Try entering a complex function that can be applied to any cell.
Python is free and very flexible. Jupyter is free and very powerful. Excel is not a programming language or tool, it is a consumer product.
Give JupyterLab a try. Your colleagues can play with the same data during your presentation if you set up a JupyterHub. Here is a demo.
NetMath's Calc 3 stopped using the "real" version of Mathematica many years ago in favor of some weird web interface. From what I know, my year was the last to use the actual Mathematica program, and that was 5 years ago.
What you're seeing on EWS is Mathematica, notebooks and all. I'd suggest you look up the documentation to learn how to use it if you want to continue using Mathematica. You could start here: http://reference.wolfram.com/language/tutorial/UsingANotebookInterface.html
Alternatively, if you want to avoid the proprietary Matlab/Mathematica nonsense, you should probably go look into something like Jupyter instead. Basically everyone but TAM uses Python/Numpy and friends (and hopefully that improves soon).
Take a look at Jupyter, formerly the IPython notebook. It allows you to use markup and several programming languages in a browser to create and present on a topic. It's already in use by engineers and scientists, since it lets you publish/present exactly what you calculate with.
this probably doesn't belong here, but - have you checked out Jupyter? https://jupyter.org/index.html - working with pandas seems to be very very popular with jupyter - lots of charts, etc.
maybe a next step, or side step, or whatever?
A self taught course would probably be much better than college. Something like this https://www.udemy.com/course/complete-python-bootcamp/ is good. It’s often discounted to $10-20. I wouldn’t pay full price for udemy courses as they are on offer so frequently.
If you get stuck and need help also check out r/learnpython
Python won’t be a problem speed wise unless you are doing high frequency trading and need to save milliseconds and you’re working for a hedge fund etc.
Also, if you need to in the future, you can rewrite parts of your code in C++; Python can call into that code too.
The ease of use and huge amount of libraries make python great for focusing on tasks rather than worrying about implementing, for example, all of the nitty gritty bits of a bot.
Many also use pandas for data analysis along with Jupyter Notebook, which is also great for prototyping and trying out ideas interactively.
Why do you automatically reject Tkinter or PyQT or Kivy?
It seems like you've decided what you don't want already.
Tkinter is admittedly not beautiful, but with its ttk extensions it looks pretty decent, and it's built in to most versions of Python. It's a little awkward, but quite easy to get started with, so it's often a very good choice for beginners to GUIs and hobbyists.
PyQT is extremely popular, and can look quite good. With the QTdesigner you even have a visual tool for creating forms. The downside of PyQT is installation. It's a bit of a hassle to install, and it's difficult to make a portable app, because the library needs to be installed on the user's machine as well.
Kivy is a modern and extremely powerful tool. It looks quite good, and is very fast. It has some of the same installation issues as PyQT because it isn't built in to Python.
The easiest solution for deployment is to use a web application. There are two major solutions to this: One is to use Python as a backend framework with something like flask, bottle, or Django. With these frameworks, HTML and CSS are the front end, so your interface can be anything you can do in the front end. They can be a hassle to install, and you need a web server, but anyone can run your programs without a local version of Python. I like bottle for beginners, because it's much simpler than the other popular frameworks and can be run without a complex installation. Just drop bottle.py in your directory and you're ready to go.
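If it helps, here is a minimal sketch of the "drop bottle.py in your directory" approach; the route and port are just examples, not anything from the original post:

    from bottle import route, run, template

    @route('/hello/<name>')
    def hello(name):
        # Render a tiny HTML response; in a real app this would be your form/report page.
        return template('<b>Hello {{name}}</b>!', name=name)

    # Serves on http://localhost:8080/hello/world
    run(host='localhost', port=8080)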
With the content you're describing, you might also enjoy using a Jupyter notebook. The easiest way to play with this is to go to https://jupyter.org/try or https://kaggle.com. A jupyter notebook runs in a web page, and you can write python code directly in the page. It's ideal for graphing and data analysis.
In many cases, the visual part is independent from the language. For that, there is Jupyter Notebooks. https://jupyter.org/
It's not tied to a language, it's a protocol. It supports over 40 languages.
Hi, you should take a look at Jupyter; it's perfect for this kind of small Python snippet that does data analysis.
And on top of that, you can easily share the result by publishing it online.
https://jupyter.org/index.html
https://www.continuum.io/downloads
https://docs.python.org/3.4/tutorial/
You can select versions in a dropdown menu; 3.5 is the latest.
Python has several different IDEs, and you can also try Jupyter Notebook (which isn't an IDE) in the web browser.
I'm a student, so don't ask me what's best. Matlab has its benefits, but by the time I'll be an employable BA engineer I think Python will be a net positive over Matlab. There's a reason our curriculum changed to exclude Matlab, and I'm sure that's been a couple of years in the making already.
I will also chime in (agreeing with everything said above) and mention that it is INCREDIBLY important to learn now how to break up your 'program' into components that you can work on separately. This keeps the program maintainable as it grows in size, but the main reason it's key for beginners is that it gives a nice positive feedback loop of actually being able to finish something tangible.
E.g. before you get to the interpretation etc., start by saying: I'm going to take a CSV file that has the EEG data, read it in, and plot it (e.g. using matplotlib); see the sketch after these steps.
Then step two would be: I'm going to refactor that script (rewrite it, using your better understanding, into a longer-term solution) so I can read the data in and then pass it to a 'plot_eeg' function.
Then step 3 may be: OK, now I want to be able to read in 2 EEGs and make an overlay plot to compare them. Repeat step 2 (refactor after every step).
... step 4, use pandas to come up with key statistics I can use to compare (peak height, time between peaks, whatever)
....
...
Then you can eventually get to: I will turn it into an app/webapp where a user can upload a CSV file, and it will compare metrics and make a prediction, etc.
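As a rough illustration of steps 1 and 2 (this isn't the OP's actual data; the CSV name and column names are made up), the first script-to-function refactor might look something like this:

    import pandas as pd
    import matplotlib.pyplot as plt

    def plot_eeg(df, ax=None):
        # Assumes hypothetical 'time' and 'voltage' columns in the CSV.
        ax = ax or plt.gca()
        ax.plot(df['time'], df['voltage'])
        ax.set_xlabel('time (s)')
        ax.set_ylabel('voltage (uV)')
        return ax

    eeg = pd.read_csv('eeg_recording.csv')  # hypothetical file name
    plot_eeg(eeg)
    plt.show()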
Also, as there is a fair amount of analysis you'll be doing as you learn (e.g. plotting, doing calculations, etc.), you absolutely should check out https://jupyter.org/ (the successor to http://ipython.org/notebook.html).
See sagemath.org. The recommended way to go these days is to use a Jupyter notebook.
Certainly Sage includes the functionality to do public key crypto, etc.
look into jupyter notebook too. it's a popular choice for interactive python programming. you can try it online too at https://jupyter.org/try. i tend to use that for python. rstudio is definitely the way to go for r
You can install Jupyter without using Anaconda: python3 -m pip install jupyterlab. See here.
You can also use Sublime Text to write code that runs using Anaconda's Python interpreter. You'd need to create a Sublime Text build system that points to the interpreter that's installed with Anaconda.
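A build system like that is just a small JSON file (Tools > Build System > New Build System). It could look roughly like the following, where the interpreter path is only an example and should be whatever your Anaconda install actually uses:

    {
        // Point "cmd" at the Python that ships with Anaconda (path is just an example).
        "cmd": ["/home/you/anaconda3/bin/python", "-u", "$file"],
        "file_regex": "^[ ]*File \"(...*?)\", line ([0-9]*)",
        "selector": "source.python"
    }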
To find your Python installations, it's helpful to know how you installed them; then look up where the default installation goes for that installation method. You can also check a few common install locations.
Before you remove any of these, make sure you know exactly which version you're affecting. You don't want to accidentally remove the (outdated) system version of Python, because it's used by important internal system tools. It's really helpful to learn how to navigate your filesystem in a terminal, because some things are easier to find in a terminal than in a file browser. It's also good to learn how to make an alias. You can then make an alias for any Python interpreter on your system, and use it just like you're currently using the python3 command.
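For example (the paths here are purely illustrative), on macOS or Linux you can list every python3 on your PATH and then alias the specific one you want:

    # Show every python3 the shell can see, in PATH order.
    which -a python3
    # Make a dedicated alias for one specific interpreter (example path).
    alias py310='/usr/local/bin/python3.10'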
Apropos Haddock: you quoted what was a throw-away comment that should have been left on the cutting-room floor, and apologies for the unfair and somewhat snide commentary. I removed it. Haddock is delightful to work with, and thanks!
But just in my little template alone, your comment red-flags stack, hlint, and github/actions as low quality. In each case, the YAML decision says to me "We won't make you learn a new configuration format" rather than "Our standards are low." I format cabal files as YAML. It looks like a YAML file, and parses as one, I expect. And yet I hit <TAB> in places and it does some crazy indentation thing from the 90s, and has done so for 10 years now.
By ecosystem, I mean Markdown. I will defer to your technical knowledge of its limitations. It's a loose standard.
In getting this template together, however, I skimmed over the long history of literate programming, and it broke my literate heart. We took Knuth seriously and built in lhs and bird tracks to accompany and support the concept. We should be the home of literate programming but it's dead as a doornail.
Meanwhile, other languages took markdown and ran with it, using (and here lies the tragedy) pandoc to embed it in their rapid development processes. They've gone on and built empires off of it, such as https://jupyter.org/ and https://rmarkdown.rstudio.com/. We can't even begin to compete. We have to re-route everything through ihaskell to even get to the starting line of the rapid, general data analytics race.
There is a downside to technical excellence above all else.
Installation instructions: https://jupyter.org/install.html
For jupyterlab or notebook.
Most likely, for Windows,
py -m pip install jupyterlab
or
py -m pip install notebook
For macOS and most linux,
python3 -m pip install jupyterlab
or
python3 -m pip install notebook
I just love this plot. Definitely something I would listen to.
BTW, Jupyter Notebook is exactly what it is called. Which is going to make search suck.
"JupyterLab: Jupyter’s Next-Generation Notebook Interface"
People that work in the AI/ML field will definitely be confused.
Google's Colab is basically Jupyter with some extras, and Jupyter is available under a GNU-friendly BSD 3-clause. If you have some hardware lying around, it doesn't take too much to get set up with a notebook server. Colab notebooks are even in Jupyter format, so it'd be easy to migrate any old projects.
Python code runs through a Python interpreter, which is pretty much just an app you can install on your computer that isn't visual (e.g. no window pops up, but it can run your Python files).
In our case, we run our Python code through something called a Jupyter notebook - which is a super beginner friendly way of executing your Python code that's really good for cleaning/analyzing data, manipulating graphs, etc.
I'm no expert on Python code execution, but feel free to ask any follow-ups and I'll be as helpful as I can be :)
Check out Jupyter - it might be what you're interested in. It's a development environment for Python that also allows you to write Markdown inline - and so you can present / read them visually. You can do some interactive stuff here - but not sure if it'll give you the dashboard look you're looking for :)
Jupyter is correct. https://jupyter.org/
Agree with most of the other points. I wouldn't expect tons of information on most new grad resumes. Don't list out details of the classes, just give a list of some of the most relevant courses to the job.
I'd also put academic projects first since those are more interesting and they would set you apart from other applicants at your school. This is just personal opinion though, others may disagree.
I would add a section for technical skills. You can convey some of the information you learned in those classes here. Again, just a list. I can see Python, R, SQL, Tableau, Networks. Add whatever you think is relevant to the job. As an interviewer I want to be able to glance and get an idea of what skills you have.
It's very tough to do as a student, but as you get more experience try to read this from the recruiters perspective. What did this resume just tell me? It tells me you used a lot of space to show me you took a few classes that are pretty standard and I had to go all the way to the bottom to see what sets you apart. Not all recruiters will get to the bottom before putting it aside.
First off, welcome to the python and data science communities! I understand where you're coming from. It takes quite of bit of practice/experience to get comfortable with the process and structure behind data science projects, especially if you're starting something from scratch. It's great that you have someone to deliver results to and get feedback from.
It's a bit tough without seeing what you have already to give specific advice or feedback, but generally I would suggest looking at templates to help bring some more structure into your project, even if you just use them for inspiration or a starting point. One that comes to mind is the Cookiecutter Data Science template (https://drivendata.github.io/cookiecutter-data-science/).
You may also want to look into Jupyter Notebooks (https://jupyter.org/). Depending on the type of project you're doing, it might be a good way to go if you're at the step of doing more exploratory analysis, versus developing a model or algorithm in your Python IDE. That's not to say it's one or the other, but it's worth getting familiar with Jupyter and trying it out if you haven't already. One more thing in your toolkit :)
Don't look at programming only as an end in itself, but also as a tool. And yes, you can go all the way in that direction, but you can also use it to become a better professional in whatever field you're in, especially economics, when you apply it to data processing.
Python in particular is quite open in this regard, and it wouldn't surprise me if there are courses related to economics, finance, or similar topics that use Python or Jupyter as a tool to analyze data, trends, and the like.
In any case, if you want to learn more Python there are tons of courses, tutorials, books (free and otherwise), workshops, and various challenges for the language. If you've already taken the first steps you can learn a lot on your own, look at some not-too-complex project written in it, or build something yourself (if you can't think of what, well, what I said above about your career is a good starting point). I usually recommend curated collections of resources on technical topics (for example, in this case, Awesome Python), although they tend to be in English and at an intermediate-to-advanced level, but you can find plenty with Google searches.
Python has a pretty rich ecosystem of math, scientific, and data processing libraries. It's a pretty expressive language, and a lot of these libraries have compiled components (often C) that make math and data processing pretty quick, too.
To see a glance of what is possible, check out the examples here: https://jupyter.org/
And for a popular real world example, the Event Horizon Telescope's imaging was done mostly in Python: https://github.com/achael/eht-imaging/
Check out SQL Server Machine Learning Services.
ML obviously has a pipeline, and the data is usually stored in a SQL server. Python and R are great for ML and forecasting/modeling.
Also check out Jupyter Notebooks. Azure Data Studio (effectively VS Code, but for relational databases) has that built in as well, with really cool pipelines you can build out and your choice of language.
This is really cool. I'd love to see the full script. Do you have it in an open repo you could link to? Or could you put it in a pastebin link?
On a side note, you might be interested in trying Jupyter Notebooks for scripts like this. It can give you the output in the same window where you write and run your code.
Well python has all the fancy science libraries, but IMHO ruby is a much more eloquent and expressive language.
Notable gems would be https://github.com/SciRuby
Just suck it up and learn python, it's going to be more marketable. I was really fond of https://jupyter.org/ when I was doing that kind of thing.
Seems like the way you would script it in Jupyter. Classes aren't necessary for good style, only if you want to write object-oriented code. They probably all thought of this as a scripting task; otherwise they would have done it more cleanly.
Jupyter notebook and lab are terrific tools for doing data analysis using languages like Python or R.
If ya don't know Python, this is a great place to start.
The Jupyter notebooks are just advanced REPLs where you can write notes and test different functions like I did here! Let me know if I can help!
Good question. The Jupyter site doesn't express a preference between pip and conda, apparently they regard them as equivalent. Not so for pyenv, so maybe avoid that approach.
Create a Linux partition, install your preferred distribution, then install Jupyter and whatever language kernels you find desirable. That will provide an interactive workbook environment so you can evaluate how Linux behaves with respect to the various Python-based analysis packages (as well as Java and other supported languages).
> ... however I wanted to know how is the compatibility with programs such as OriginLab; is there a way to use it without having to do a virtual machine?
If OriginLab is a Windows-only program, then you would need to think of a way to run it -- either in a VM or under Wine. But here are some alternatives to OriginLab that run under Linux.
Do you really need that specific program? There are any number of programs that support technical data analysis and graphing, for example Jupyter, which combines a multi-language environment, a widely supported notebook format, and graphing. And it's free. And it runs under Linux.
As just one example, the LIGO (gravitational wave) project uses Jupyter, and even publishes their results in the form of Jupyter notebooks.
> One more thing which distro would you recomend on using?
Based on your experience level, I recommend Ubuntu. Changing distributions doesn't really change how much you learn -- that has more to do with your activities in the installed distribution.
You should also have a look at Jupyter Notebooks. They come with Anaconda or can be installed separately. They do have the advantage that you can execute parts of your code very easily. https://jupyter.org
Just looking at their respective websites (1 and 2), I would say there is a pretty obvious response: the TeXmacs team has either not spent much project time on branding, or is determined to brand itself as intended for the niche of open-source enthusiasts (obviously there may be other reasons, but they probably aren't too far from either of these). I would claim that the branding of Jupyter is more inclusive.
What you are describing sounds like Jupyter Notebook. You don't have to do anything on your site, you can use just the existing Notebook Viewer.
There is a truly magnificent thing called Project Jupyter. It allows you to upload Python files, store them, and run them online. It also has a lot of integrations for other programming languages, but as far as you are concerned, yes, you can upload Python code/files and store them in what are called IPython notebooks, which you can run online.
I'm going to throw out another vote for Python. It was the language taught by my department in undergrad, and should serve you well in whatever direction you take your career, at least as a stepping stone.
I'm personally a fan of the Jupyter/iPython distribution (https://jupyter.org/index.html) since that is what I learned in, but it can also be used for Julia or R, plus whatever else the community develops a kernel for. Very flexible and easy to see, share, and use, IMO.
There is a way to do this, but it requires an incredibly well-polished IDE to make it work.
It's still text-based.
Imagine HTML (or maybe RTF; it's almost the same for this argument).
HTML has separation of content, layout, and style (YMMV on how well). You can have a visual representation of code, plus the possibility of embedding images, multimedia, and custom widgets per page/IDE/language, so you "solve" how this could get some adoption.
However, expect the "browser" to be complex, at least similar to a regular word processor.
But it's not that alien a concept; it has worked before (in other contexts), and we have a close, truly successful implementation in the case of Jupyter Notebooks (https://jupyter.org/).
I have been thinking about this in relation to something else (a truly new terminal, NOT an emulator, but the same idea), and the vision is to have an HTML-like markup that allows switching widget rendering, so it could look like this (totally invented):
<codeDom lang="F#"> <meta licence=MIT>
<namespace="Utils" file="Utils.fs"> <code> <doc> A minimal print function </doc>
let print x = printfn "%A" x
let blue = <type=Color render="ColorPicker">#00688B</type> </code>
The idea is that the doc is alive. Using a static-type compiler, the tool could annotate the code on the fly (for example, noting that the hex value is a color and can be rendered with a cool color picker), and it could be made to work with dynamic languages too.
Or the user can annotate the code, if the tool fails.
Or maybe the doc is made from only includes:
<codeDom lang="F#"> <meta licence=MIT>
<namespace="Utils" file="Utils.fs"> <code> <doc = "Utils.fs.doc#print"> <src = "Utils.fs"> </code>
So the files are standalone as "usual" and the doc is just the cool metadata. This way, the normal tools will work as expected, yet, we have a way to spice up the code!
The best thing I can recommend is jupyter (formerly IPython Notebook). Click "Try it in your browser" to give it a spin.
It's basically an interactive Python notebook organized into 'cells', and it allows you to run each cell individually. This way you can write some example code with calculations and run it on the spot to get a result. You can also display charts and graphs, and use markdown to display equations in a nice format similar to LaTeX.
One of the best features is that you can store them on github and use nbviewer to access your notebooks from your browser. You can view a ton of examples here: https://nbviewer.jupyter.org/github/unpingco/Python-for-Signal-Processing/tree/master/
Of course all of this necessitates you knowing how to code in Python. If you do then it's pretty much unbeatable for technical presentations.
Python. It is extremely easy to install and use, and it is easier to migrate to another language at a later stage.
It is also used across a number of professional fields, more so than other languages.
Look at Jupyter Notebook if you just want to tinker. It allows you to write your code and documentation together, along with just running segments as you go. There is even an online version if you have issues installing.
Must admit I need to go back and read over the details, but I thoroughly enjoyed the read, thank you!
Have you looked at Jupyter notebooks? I think the presentation of this article/code is absolutely fine, but if you haven't tried notebooks, they're an easy way of intertwining code and accompanying text.
There is a bash kernel for Jupyter (formerly IPython).
You can try it here. Click on the new button in the upper right corner and choose bash from the drop-down menu.
http://ipython.org/notebook.html https://jupyter.org/
Last I checked there are similar things for other languages (R, Haskell, etc); publishable as web pages as well. I use this all the time for my portfolio work and lesson plans when I taught, it should be easily adaptable for a research notebook
edit: I know you said avoid digital versions, but once I got in the habit of using it I saved a lot of time. At the very least, you can have paper printouts of code segments.
This is a very cool project... That said:
I am wondering why you wouldn't use IPython Notebook (Jupyter) to power the same type of functionality. Furthermore, one can Google PCA analysis for IPython and get results like this, and Google returns tons of results for bioinformatics with IPython Notebook.
My intention is not to sour your good work... I am curious how this differs from IPython Notebook/Jupyter.
If ever you need a terminal based calculator try using Python (start it with python3) or, if you prefer a GUI, via Jupyter. I find the terminal program units (see man units for more info) useful too, although Plasma will do maths and unit conversions via its KRunner plugins.
>I wanted to hear your opinion about the feasibility of such a thing.
It sounds similar to what Jupyter does; it runs a tiny web server running python and then pops up a browser connecting to localhost:8000.
You could write a program that works in a similar fashion to Jupyter, or use Jupyter directly. There is a lot of support, and there are expansion options with remote-hosting servers, JupyterLab, or JupyterHub.
This is what Jupyter is https://jupyter.org/ It's a popular tool for programming and data science - I believe that's the term used. While I occasionally use ipython in a terminal, I don't use Jupyter itself. I understand that it's a very useful tool for some tasks and workflows. I often see Python users speaking highly of it. Apparently Jupyter supports different programming languages and not just Python.
I've had nothing but headaches trying to use jupyter notebooks in VSC. Instead what I do is use Jupyter Lab as my IDE for anything Jupyter notebook related and use VSC for anything else.
Check it out here: https://jupyter.org
The requisition does a pretty good job explaining what is required but let me go a little deeper
A testbed role is about sending commands to test the hardware. There might be some connector mate/demate (maybe using a breakout box, which would require probing the signals and ensuring that the actual output is the expected output), but most of the job will likely be writing a test plan that describes how requirements for the hardware and software will be verified, and then creating commands and scripts to do that. Experience or familiarity with Python or a tool like Jupyter will be looked on very favorably.
Hope this helps, and good luck!
I think it is worth going back to basics, uninstalling Python and starting again. Use the update PATH option.
Then, in a cmd.exe environment,
py -m venv venv
venv\Scripts\activate.bat
py -m pip install notebook
jupyter notebook
https://jupyter.readthedocs.io/en/latest/running.html#running
A hardware description, as well as the way users will be expected to use it, would be helpful.
Nvidia GPUs? - Nvidia's NGC catalog of docker images is extremely helpful here - they contain already configured compatible combinations of libraries (cuda/pytorch/etc) and can save weeks of fighting to configure them yourself.
Multiple users working on it at the same time? Jupyter Hub might be useful.
I agree that keeping that LTS OS is a good idea. If/when people think they need a newer version of Ubuntu, like some Ubuntu 21.10 dependency, they could possibly get by with Ubuntu 21.10 docker images. Or a VM, but it gets painful to share GPU cards with VMs, and painful on your RAM if many people run a full VM at the same time.
You ever hear of Jupyter Notebooks?
https://www.youtube.com/watch?v=jZ952vChhuI
You can run more than just Python in there, but most researchers use Python so that's the focus.
So I could just download jupyter off of (https://jupyter.org/install.html) and I should be able to run it fine? Sorry, not familiar with M1 so I just want to make sure before I do anything.
It can be installed as a regular Python package using: >pip install jupyterlab
Then run from the command line using the command: > jupyter-lab
Edit: After rereading your post I assume 'for my organization' to mean you need to install it for many users?
Not sure how far along you are, but if you can do some light setup, you could give jupyter notebooks a try. This allows you to write notes alongside runnable snippets of code. It also supports markdown, so you can have nicely formatted links, lists, etc.
Source: My boyfriend and I's Telegram chat data
Tools Used:
Jupyter Notebook for data processing and visualization,
Canva for final presentation creation
Well, two years ago I wrote an interpreter for a subset of Python, but it isn't able to run generic Python scripts (or even native modules), so it's probably not what you're looking for.
If you just want to experiment with Python libraries and don't want to learn about Python internals, it might be a good idea to look into Jupyter. There's also a Jupyter plugin for Visual Studio Code, in case you're using that IDE for Dart. I think there's also something called JupyterHub to host notebooks, which can then be accessed via HTTP, for example from a Flutter app.
“I’d rather have a box with django running all the time...” - you’re supposed to set up venv, then when you’re done developing django components you deploy to a box that runs all the time, if you’re doing data intensive work like text mining you would deploy to a large cluster. Having a venv keeps your component unique on that cluster. This is how Jupyter is designed: https://jupyter.org/
Python is a good choice as it is easy to learn and there are a lot of free resources online. I can highly recommend free courses on: https://www.freecodecamp.org/ And https://jupyter.org/ for coding
Depends on how technical you are. If you need a repeatable process for aggregating your spreadsheets into custom visualizations, you might look into Jupyter Notebooks or Observable.
If you want it to be extremely user friendly and interactive, you might be best off building an entirely custom solution for your device/platform of choice.
You could use Jupyter notebook. It is good for this kind of exploration in Data Science/Machine Learning/Deep Learning.
Basically you split your code into parts (code cells), which then can be executed 'separately'.
In your case, put all code with imports at the top part and run it once. Then work with what you need to change and re-run only the latter part(s).
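As a tiny sketch of that layout (the file name and column are made up for illustration), the notebook could be split like this:

    # Cell 1: run once -- slow imports and data loading stay up here.
    import pandas as pd
    df = pd.read_csv('big_dataset.csv')   # hypothetical file

    # Cell 2: re-run this cell as often as you like while iterating.
    subset = df[df['value'] > 0]          # hypothetical column name
    subset.describe()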
Not sure what kind of computer or Nvidia processor you have, and while it's likely not the issue here, you may want to check out Google Colab for a quick & easy way to get up and running training models in the cloud. They provide a custom hosted version of Jupyter Lab, and access to GPUs for training for free.
After trying this, did you restart the Jupyter server and/or kernel? Alternatively, another way I’ve been able to do this is by using app mode, but like you mentioned, it requires an extension.
I just found an answer to my question - thought I would tell you "problem solved".
https://www.reddit.com/r/quantum/comments/l7ake9/questions_about_qutip_software_on_python/
My bad. Here is the missing reference: https://jupyter.org/ Jupyter notebooks (often referred to as just “Jupyter”) are a form of documentation/writing/computation environment: there is a server which runs a “Jupyter kernel” for some language (originally just Python; others have since been implemented) and a UI that works as an IDE for a document-first experience.

The reason I posted my message above is that I would love to see an OS with an evolved calculator that has some mode/layout inspired by Jupyter, probably named “notebook” and placed alongside “basic” and “scientific” as an option in the calculator menu, if I were building an OS, but I am not a designer. Jupyter is like “literate programming” but interactive, and I would like to see an OS calculator that leverages that approach for academic work, for both students and teachers, because OS calculators are free. Given that you make an OS, I guess you could make it a stand-alone program and also make it free; anyway, that is like 20 conversations away.

Given that the OS can integrate accounts and authentication mechanisms, you could enable both tests and homework to be written and handed in from the notebook app between teachers and students. A math calculator plus LaTeX should be enough to build a great experience, since physics and chemistry can be broken down into those two naturally.
What is a good reason to use python for web scraping?
The primary reason I use python is for web scraping, but I use other tools, so if you're interested in web scraping, check out Scrapy and Jupyter.
A Scrapy tutorial is also good, but it's not exactly clear if it's the best tutorial you'll be using.
I don't know if this is what you're trying for, but you could use Jupyter notebooks to write actually executable notes. Prose in markdown cells, and examples in code cells that can actually run with some dummy data.
> To my knowledge, Jupyter Notebooks don't have [inbuilt collaboration]
You're exactly right, for vanilla notebooks. This is why Jupyter developed JupyterHub.
There are also Google Colab and Zeppelin notebooks. The former is obviously a Google project, and hosted on the cloud, while the latter doesn't have to be, but has built-in user management, security, etc.
One GUI tool I find useful for learning and general experimenting with Python is Jupyter Notebooks. https://jupyter.org/
Usually I start a notebook in the same location as my scripts so that I can still package code in scripts but play with it in the notebook.
Also when I was first learning Python I used Anaconda (which also allows you to install Jupyter), but I've found it easier since to just use the command line. It is very easy once you learn a few commands.
Visual Studio Code (free and open source) is also a fantastic editor for python.
Sure thing.
One last thing: that book shows the algebra and calculus techniques using an awesome/powerful Python module called SymPy. It's different from other Python math libraries in that it solves equations symbolically; it can actually solve equations for you, which is helpful for checking your work.
Using Jupyter online, you can perform all your Python calculus experiments in the browser, and the Jupyter notebook will render all of your math equations in standard math notation, just like a textbook.
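For a flavour of the symbolic side (this is not from the book, just a generic SymPy example):

    from sympy import symbols, solve, diff, integrate

    x = symbols('x')
    print(solve(x**2 - 4, x))    # [-2, 2]
    print(diff(x**3 + 2*x, x))   # 3*x**2 + 2
    print(integrate(2*x, x))     # x**2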
This site and language are undeniably awesome, but making toy languages for teaching programming seems as counter-productive as making toy languages for teaching grammar.
Just starting folks with bad habits and expectations.
Teach a range of the most popular and useful languages: spreadsheet functions, JavaScript, and Python.
Your website puts
To ABSOLUTE shame, it needs a quick loading, quick editable single toy page like yours.
I also automate most things that can be automated. I'm a robotics software developer now, but my last job was hardware R&D and I had a ton of FEA postprocessing and experimental data to process.
>Recently realized how much of my job involves routine data analysis that could be analyzed faster and more thoroughly by a simple program.
Over the years, I've started to collect that kind of analysis (and in my case, plotting) functionality in a small Python package using numpy and matplotlib.
Then I write progress reports in Jupyter notebooks that import my plotting and analysis libraries.
It's easy to re-run the same report with new data, or to start new reports, and making it an installable package gives a single point of responsibility for maintaining the code, so I know all my reports are running the same analysis.
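The shape of that kind of package is roughly the following; the module, file, and column names are invented for illustration, not the author's actual code:

    # myreports/plots.py -- tiny analysis/plotting helpers a notebook can import.
    import numpy as np
    import matplotlib.pyplot as plt

    def load_run(path):
        # Read one experiment's CSV into a structured array (header row assumed).
        return np.genfromtxt(path, delimiter=',', names=True)

    def plot_run(data, label=None, ax=None):
        # One standard plot used by every report, so results stay comparable.
        ax = ax or plt.gca()
        ax.plot(data['time'], data['value'], label=label)
        ax.set_xlabel('time')
        ax.legend()
        return ax

A notebook report then just does from myreports.plots import load_run, plot_run and calls them on the new data.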
I will say that the initial tooling is time-consuming, even after I had a lot of experience, but if you're always writing the same analysis, absolutely write yourself a little code somewhere to pull the data and generate the plots automatically.
Look at the data science world for tooling. Something that they like to do is use one of these notebooks (also look at R) to write a whole report with text and plotting code that pulls in raw data and eventually distills down to a .docx or .pdf for your boss.
You can even make programmatic changes in the text itself.
I never spent the time to work out how to get a professional nice-looking PDF out of Jupyter, my boss was okay with HTML when he actually read 'em. And going all the way to "a report that writes itself" is probably not justified in most engineering workflows. But a plotting/analysis library system that lets you pull and transform raw data and make a commonly used plot in a few lines of code is very nice.
I know this is not what you are looking for, as it is a bit far away from "simple", but I think a project like this isn't widely used anymore because people do their annotated computations on platforms like Jupyter Notebook and nteract, where they have the power of a full-fledged programming language and countless libraries, but can keep it simple if that's all they need, and of course can still annotate their computations as they go. I know a lot of scientific research projects use these to keep track of the steps.
Just thought I'd share this info.
Jupyter notebooks allow you to combine documentation and code into an interactive document. You can put in markdown and then put in code examples that you can run in the document: https://jupyter.org/
There is a PowerShell kernel so you can make interactive PS docs.
By "like RStudio", do you mean something with a dedicated area for viewing plots and things like that? (Honestly, I must have used RStudio fewer than 10 times in my life.)
If so, look up Jupyter; in fact you can try it here: https://jupyter.org/try (edit: the "classic notebook" or "JupyterLab").
https://www.edx.org/course/introduction-to-computer-science-and-programming-7
https://automatetheboringstuff.com/
For data science, learn how to use Jupyter notebooks and do some data analysis: https://jupyter.org/
I'd say mostly no. However, there are a handful of folks interested in moving away from static slides and using tools like Jupyter Notebooks. Not enough to get traction, I'd say. The training department would be more interested in offering workshops on JNs if they had the tech skills to teach it, I suppose.
>using Pkg
>
>Pkg.add("Genie")
Thanks, that worked to install the package.
But then when I proceeded to julia> using Genie.Router
I got "UndefVarError: julia not defined" again.
Edit- I know I'm probably missing something incredibly obvious and should probably just learn more Julia syntax. This is what I'm using, if it's relevant.
Those are just cell numbers from a Jupyter notebook. You don't need to worry about them, though you might want to check out Jupyter if you aren't familiar with it; notebooks fill a similar role to the Python REPL (Read-Eval-Print Loop) and can be nice for creating reports and light prototyping.
Wouldn't you have to replace Jupyter Notebook with orgmode to get those benefits? Is it superior? I've seen Emacs do some nice ascii art (artist-mode), but can it do stuff like graphs and charts?
> To me it seems like with everything moving to cloud based services learning things me tkinter or pyqt won’t be as popular as something like flask or django.
This is absolutely, 100% the case. It's so much the case, that even major desktop tools sometimes use a browser interface. So if using the browser is feasible and convenient for your end users, then that approach is completely feasible.
Not necessary, but if you want to play with VMs, more RAM is recommended, and these days RAM is affordable.
About the NAS: depending on the specs, a CPU swap might be doable. A 130W CPU is too much if you don't have many users on your NAS. I went from dual Xeon E5620 (80W each) to dual L5630 (40W each).
About interests, search for WIYH ("What's In Your Homelab") threads; you can see lots of info there. My current plan is to deploy a Kubernetes cluster on the 4 Raspberry Pis I have and run some documentation apps like Jupyter, Git, whatever... even WordPress to store cheat sheets, information, etc. from my homelab and interests.
Also take a look at the wiki; there is a ton of info there, maybe a little old but OK.
I forgot to add: Ansible also helps me in my lab.
I would set up one of the open source python runbook platforms, Jupyter being the most popular I believe. It sounds like this is really the use case these are made for. https://jupyter.org/
or what about Azure notebooks (which I believe itself is a variant of Jupyter)? https://notebooks.azure.com/ They specifically advertise it's free... not even that it's part of some free tier, but free...
Are you going to use Jupyter notebooks?
Install Visual Studio Code and Jupyter (pip install jupyterlab), and run your Jupyter notebooks in Visual Studio Code with all of its advantages.
If you want Jupyter notebooks, by all means...
Ditch Anaconda for a bit and just get a standard Python environment running.
Once that is working well you can install jupyter on it using:
py -m pip install jupyterlab
Run it using
py -m jupyter lab
Or install notebook instead of lab. Stick with the py -m rather than using pip alone as that can have version and path issues.
Hey man, I quit WoW over the Summer and I happen to be learning SQL and Python right now. This is a pretty good resource for basic SQL: https://www.w3schools.com/sql/
For Python, I really like Jupyter Notebook: https://jupyter.org/install
Let me know if you’d like some SQL practice questions — I’m taking an intro to databases course atm.
God speed!
Ok a few things.
Why are you using interactive python? You should be doing this in a .py file and running it from that.
You cannot use a return statement outside of a function.
questions = ['what is' ...]
is indented but it isn't inside your for loop, nor is it in the if block, nor is it in a function. Indentation matters in Python.
normalized_title = submission.title.lower()
submission doesn't exist here; that's why it's telling you it's not defined. This needs to be part of your for submission in subreddit.stream.submissions(): block.
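In other words, the structure needs to be roughly this. This is only a sketch, assuming praw and the names from your snippet; the credential values, subreddit, and question list are placeholders:

    import praw

    reddit = praw.Reddit(client_id='...', client_secret='...', user_agent='my-bot')
    subreddit = reddit.subreddit('learnpython')   # placeholder subreddit
    questions = ['what is', 'how do i']           # abbreviated placeholder list

    for submission in subreddit.stream.submissions():
        normalized_title = submission.title.lower()
        if any(q in normalized_title for q in questions):
            pass  # reply / log / whatever the bot should do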
I would recommend you delete that image; your masking of your name and Reddit API keys is transparent, and I can see the names/values on my monitor at full brightness.
If you are absolutely dead set on using interactive Python, install Jupyter Notebook.
If your data matches the numpy array features, I can recommend it.
If your data is more than pure numbers and you need indexing, relations, stats, storage options, etc., then pandas is really effective.
Edit: pull them all together to try them out in jupyter: https://jupyter.org/
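A quick side-by-side sketch of that split (toy data, nothing project-specific):

    import numpy as np
    import pandas as pd

    # Pure numbers: a numpy array is usually enough.
    arr = np.array([[1.0, 2.0], [3.0, 4.0]])
    print(arr.mean(axis=0))

    # Labelled/mixed data: pandas adds indexing, stats, and easy storage.
    df = pd.DataFrame({'name': ['a', 'b'], 'value': [1.5, 2.5]})
    print(df.describe())
    df.to_csv('results.csv', index=False)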