Sorry it was unclear. Our frequent readers know of our API directory, which was started in 2005 and now has over 3,500 APIs. We're sourcing the data from our own research in our publicly accessible directory of APIs.
All XML APIs: http://www.programmableweb.com/apis/directory/?format=XML
All JSON APIs: http://www.programmableweb.com/apis/directory/?format=JSON
It's fun to automate things.
The cool thing about online services is that most of them have APIs!
So automate something dumb and fun, using Python as simple glue:
Write a script that sends you an email every time something specific is mentioned in your Twitter feed
Hit the Reddit API, grab the top 10 daily hottest YouTube links from /r/<music_genre_you_like>, then hit YouTube and download MP3s of the linked songs, or parse the artist and song title from the string and add 'em to a Spotify playlist or something (rough sketch below)
Poll Fitbit daily to add your nightly sleep stats to a Google spreadsheet or Evernote table
Poll Facebook API every 10 minutes, and send someone specific a random message over Messenger every time they log in
Write a bot for Twilio to fuck with your friends. Aim to pass a simple Turing test, at least for a few minutes
Use Google Maps to plot your photographed bird sightings on a map using photo geo data (i.e., iterate over all photos in an Amazon Cloud Drive photo gallery named "Bird Club", add plots for any new ones to a Google Map, save said map to PDF using wkhtmltopdf, and upload it to Dropbox)
These are probably very dumb, but you get the gist.
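For the Reddit one, a rough sketch might look something like this (the subreddit name is just a placeholder for whatever genre sub you like, and Reddit's public JSON listing wants a descriptive User-Agent):

```python
import requests

# Grab today's top 10 posts from a music subreddit (placeholder name)
# using Reddit's public JSON listing. A descriptive User-Agent is needed
# or Reddit will throttle/refuse the request.
SUBREDDIT = "listentothis"  # swap in the genre sub you actually like
resp = requests.get(
    f"https://www.reddit.com/r/{SUBREDDIT}/top.json",
    params={"t": "day", "limit": 10},
    headers={"User-Agent": "my-music-bot/0.1"},
    timeout=10,
)
resp.raise_for_status()

for post in resp.json()["data"]["children"]:
    data = post["data"]
    link = data["url"]
    if "youtube.com" in link or "youtu.be" in link:
        # From here you could feed the title into a Spotify search,
        # or hand the link to a downloader of your choice.
        print(f"{data['title']} -> {link}")
```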
They removed it so they could stop paying Microsoft money.
They then wrote push into the Gmail API. Apple is the one who is not supporting it.
Blame Apple.
Edit: for everyone who can't use Google and is STILL pushing the anti-Google agenda: http://www.programmableweb.com/news/gmail-api-adds-push-notification-support/2015/06/03
Google wrote a push protocol into their Gmail API so they wouldn't have to pay Microsoft, which is a competitor. Apple needs to utilize this. It is no longer Google's fault. It is Apple's fault.
Edit 2: another link https://www.reddit.com/r/apple/comments/37tr55/the_official_gmail_api_finally_gets_push/
Please don't take this the wrong way, but you really don't have a portfolio. Three of your repos are simply exercises from books, dotfiles aren't usually something to show off (although it's certainly useful to keep them on GitHub - I do so as well), and pig latin converters and vowel counters are freshman assignments.
Try to build a larger project that you think people could actually use. A simple game, a wrapper for an API (look here), or just a pretty personal website where you show your projects instead of simply linking to a GitHub profile.
When I was searching for a side project, I generally went through websites like http://www.programmableweb.com and searched for cool APIs I could use without much fuss. I recommend you do the same.
Also, don't limit yourself to 1 API. Sometimes a good app can be one that combines a couple different APIs to offer a unique service.
To get the results, you would either need to collect (scrape) them from a web page that shows them and save them somewhere your web application can access, or use an existing API that serves the scores in a format your program can easily read (JSON).
Check these: http://www.programmableweb.com/category/sports/apis?category=20016&keyword=nba
For updating a web page after it has loaded, you make AJAX calls with JavaScript. You can do this easily with a library like jQuery. Here's a tutorial on AJAX calls with jQuery, for instance: https://www.codeschool.com/courses/jquery-the-return-flight
Thank you. Reddit is freaking huge and it's used by different users in different ways; there's no possible way a handful of apps can respond to every demand. If you look at r/movies, it has 8,303,189 members, and if you count the related (media) subs, that's an extremely good sum of users. An application that responds well to those types of subs can be very successful. And then there's sport, news, finance, all kinds of stuff. 65k can't be enough for everything ;)
I did some freelance work, which mainly amounted to simple, static websites. It was definitely beneficial to have some work for paying customers in my portfolio, but I knew I'd need some more technically impressive work.
For personal projects, I kept a running list of ideas I had that were interesting to me, and would also require more work (especially JavaScript). I made a couple that connected to external APIs (ProgrammableWeb has a great list of these), as well as some others that just needed a bunch of DOM manipulation or something else needing JavaScript. One in particular I chose to do without jQuery, even though it would have been much easier, as an exercise to show I knew vanilla JS.
In my experience, employers really liked that I chose to do things that actually interested me for personal projects. I was told multiple times that it showed a passion for my work.
You're scraping the page HTML after fetching it with requests, and parsing it with BeautifulSoup.
I can understand this approach if you're scraping a wide variety of sites for certain types of content, say if you made a web crawler for all types of comics from many different sites. However, since you're only targeting one site, it will usually have an API that serves the information you need in an easy-to-read format.
One downside of web scraping is that scrapers can be quite fragile. In your case you're searching for images inside an element that looks like <div id="comic">, so if xkcd ever revamps their site and that element changes in the future, your scraper will stop working.
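As it happens, xkcd also exposes its comics as JSON, so (assuming I remember the endpoint right) you can sidestep the HTML parsing entirely:

```python
import requests

# xkcd publishes its comics as JSON, so no HTML parsing is needed.
# Current comic:  https://xkcd.com/info.0.json
# Specific comic: https://xkcd.com/<number>/info.0.json
current = requests.get("https://xkcd.com/info.0.json", timeout=10).json()
print(current["title"])  # comic title
print(current["img"])    # direct image URL, the same thing the scraper digs out of <div id="comic">
```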
But still, building a web scraper in Python is a fine educational experience. If you want to take that kind of thing further, look into Selenium. The requests package that you're using only receives HTML directly from the server; it doesn't execute JavaScript, which makes it impossible to scrape sites that use AJAX calls to render their content. Selenium lets you drive a (headless, if you like) browser, which you can then use to fully render and interact programmatically with a web page, or take screenshots.
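A minimal Selenium sketch, assuming you have Chrome and a matching chromedriver installed (the URL is just an example):

```python
from selenium import webdriver

# Drive a headless Chrome so JavaScript-rendered content actually appears
# in the DOM before we read it. Requires Chrome + chromedriver on your PATH.
options = webdriver.ChromeOptions()
options.add_argument("--headless")

driver = webdriver.Chrome(options=options)
try:
    driver.get("https://example.com")      # any JS-heavy page
    driver.save_screenshot("page.png")     # full render, not just raw HTML
    html = driver.page_source              # the DOM after scripts have run
    print(len(html))
finally:
    driver.quit()
```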
They use the Destiny API. Probably the best place to start is the bungie.net group forum:
Bungie API on Programmable Web
EDIT: adding link
MyEpisodes.com uses TV Rage's API to get their data. It's freely available here.
Programmable Web is a pretty good resource. You'll find information about plenty of APIs.
The best thing to do would be to use something like Pandora to play the music and just get the song info off their service. There is an unofficial API for Pandora; I have no idea if it works well: http://www.programmableweb.com/api/unofficial-pandora
You can use Google Docs (which has collaborative editing), now part of Google Drive, and use add-ons/plugins like ShiftEdit, Neutron Drive, etc. Here is a list of other cloud IDEs: http://www.programmableweb.com/news/12-cloud-based-ides-boost-productivity-roi/analysis/2013/12/23
There is also Koding, a free cloud-based editing tool.
Forgot to reply to this post, so here it is a week later! I haven't personally used D3 before, but it's recommended a lot, so it's good that you have that under your belt.
Twitter's API can give you search results or an individual person's timeline. The first step is to think about what you want to do with the data. For one of my projects, I was curious what social media thought of certain phones while I was in the middle of searching for a new phone to buy. I used Twitter's search API to get 10 pages of results for 4 particular phones (Nexus 5, Samsung Galaxy S5, HTC One M8, iPhone 5s). With all the tweet data, I performed something called sentiment analysis (scoring how positive or negative a sentence is in tone) and calculated a basic score for each phone. This helped me gain some insight into which phones social media responds better to.
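A stripped-down sketch of that scoring step, assuming TextBlob for the sentiment part and with hard-coded stand-ins for the tweet text you'd pull from the search API:

```python
from textblob import TextBlob  # pip install textblob

# Stand-in for the text of tweets you'd get back from the search API.
tweets_by_phone = {
    "nexus 5":   ["Loving my new Nexus 5", "Nexus 5 battery life is rough"],
    "iphone 5s": ["The iPhone 5s camera is great", "iphone 5s keeps dropping calls"],
}

for phone, tweets in tweets_by_phone.items():
    # polarity runs roughly from -1 (negative) to +1 (positive) per tweet
    scores = [TextBlob(t).sentiment.polarity for t in tweets]
    print(phone, sum(scores) / len(scores))
```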
That's just one example of how to use Twitter data. I think there are more interesting things to do; you'll just have to come up with something, or use my method for products that interest you. If you like video games, I know League of Legends has an API, and so does Valve for Dota 2. YouTube and most other popular websites that get a lot of user traffic have APIs.
Here's a list I found with a quick search: http://www.programmableweb.com/news/most-popular-apis-least-one-will-surprise-you/2014/01/23
I wrote this post and found more useful related articles.
This first one touches on the points I made with additional insight and tips.
http://www.entrepreneur.com/article/233146
Lists are always useful. The 7th tip is similar to my advice to avoid working on someone else's product.
http://www.programmableweb.com/news/ten-protips-avoiding-hackathon-fail/2012/03/22
One point I did not make, because it was not really part of the points I was looking to make, was the importance of Internet access. WiFi can often be a problem at a Hackathon. At the one I attended 2 months ago, everyone overloaded the WiFi and half of my team was offline most of the time. This was not good, since we were setting up a client/server system and a stable network was important. We left the building where the Hackathon was being held and went to a nearby coffee shop where I knew there was great WiFi. There were Ethernet ports, so next time I will be sure to bring a WiFi hotspot my team can simply join. Even if we just work locally we can still be productive. We lost so much time from being offline.
I'd also encourage people who are planning on attending a Hackathon to get to know others who are going to be attending the Hackathon and also recruit others they already know to attend, even if you do not work with them on your team. Having familiar faces around is a big help. I helped people on other teams at the last Hackathon I attended because our focus was on learning and building something cool. We were not as concerned about winning the prize.
It seems like it would be reasonably easy to write a script to take the top n most common names, perform image searches for them, and count the pixels that are flesh-toned in the top x images, then sort the list by number of flesh-toned pixels... It wouldn't be perfect, but it'd get you close. I'm sure there's a "detect skin" algorithm out there you could plug in.
It looks like several of the big search engines provide an image search API - but I'm a little busy tonight.
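If someone wants to try it, here's a crude sketch of the pixel-counting part with Pillow, assuming you've already saved the image-search results locally (real skin detection usually works in YCbCr or HSV, so treat this RGB rule as a rough placeholder):

```python
from PIL import Image  # pip install pillow

def flesh_tone_pixels(path):
    """Count pixels in a very crude RGB 'skin' range.

    Proper skin-detection algorithms usually work in YCbCr or HSV;
    this is just the quick-and-dirty version of the idea above.
    """
    img = Image.open(path).convert("RGB")
    count = 0
    for r, g, b in img.getdata():
        if r > 95 and g > 40 and b > 20 and r > g and r > b and abs(r - g) > 15:
            count += 1
    return count

# Assuming you've already saved the top image-search results locally:
images = ["result1.jpg", "result2.jpg", "result3.jpg"]
ranked = sorted(images, key=flesh_tone_pixels, reverse=True)
print(ranked)
```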
Not to sound condescending, but you will more than likely have to 'borrow' the feeds from flashscore.com or some other site that has scores that are updated regularly.
If there are any available that are still free, I would check here at the Programmable Web
Most of this stuff is protected behind APIs for money, but I've heard that NOAA has a pretty good API. http://www.programmableweb.com/api/noaa-national-weather-service-nws
What do you mean "affected" by storms? Just a list of zip codes involved in various storm events, like Hurricane Sandy or something?
That might require some manual work.
Here is a list of Booking APIs, some of which include hotels. http://www.programmableweb.com/news/44-booking-apis-rezgo-tourcms-and-travelport/2012/08/21
I don't know what you are planning to do with the API, but most of them seem to be aimed at businesses selling hotel rooms, and expect you to become a partner with them :(
Well, it will definitely be easier to use a lyrics API instead of scraping terribly formatted websites yourself. Here is a list of some lyrics APIs. Scraping websites can be a pain, and if you want to scrape multiple sites, your code will need much more frequent maintenance as the lyric-website developers update their sites. For that reason, I strongly recommend using someone's API.
Seeing as you're new to coding, an API (application programming interface) is a list of routines/methods someone has developed for you to more easily interact with their data/program/etc...
I'm not sure how you're going to determine what songs to pull and compare as it almost sounds like you want to do it with every song. The more you narrow the scope of those comparisons, the easier it will become.
For counting syllables I will point you to THIS and THIS, as syllables aren't as straightforward as things like total word count and letter count.
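To give you a feel for it, here's a rough vowel-group heuristic; the resources above will do a better job, but it shows why syllables need more than letter counting:

```python
import re

def rough_syllables(word):
    """Very rough syllable estimate: count groups of consecutive vowels,
    then knock off a common silent trailing 'e'. Real syllable counting
    needs a pronunciation dictionary (e.g. CMUdict) for the many exceptions.
    """
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1 and not word.endswith(("le", "ee")):
        count -= 1
    return max(count, 1)

print(rough_syllables("lyrics"))    # 2
print(rough_syllables("syllable"))  # 3
print(rough_syllables("rhyme"))     # 1 (the silent-e rule saves us here)
```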
Ultimately, I think it's great you want to learn but think you need to take it a piece at a time. Start with building a Desktop application that can scrape lyrics. Then build a means to store and parse syllables. After that you can bring it together, build it somewhere you can access it from multiple devices (web projects are accessible on all platforms).
Good luck.
Another good collection for APIs is http://www.programmableweb.com/. At the moment, my favorite API has to be twitter's, but that's because I've found ways to tap into it for multiple different projects I'm working on.
NAL: I looked at ESPN's Terms of Service, and it would be against them. You can always ask them for permission.
I would do a search for a company that offers an API or a database for what you want.
Edit: Here are some places where you can start looking
http://www.programmableweb.com/news/91-sports-apis-fanfeedr-seatwave-and-espn/2012/08/01
http://www.quora.com/What-options-are-there-for-streaming-sports-stats-APIs
http://stackoverflow.com/questions/11899792/whats-a-good-api-for-sports-stats
Consuming a REST API (or just a web API in general) can be fun, and it's almost certainly a skill you're going to need at some point.
Just pick an API you think would be fun to work with and build an app around it.
The main site people usually point to is: http://www.programmableweb.com/
This is a good site to read if you are the entrepreneurial breed of web developer, but I haven't actually got a single mashup idea from this site. It's a little dry. It is more a PR service for companies launching APIs.
There was big buzz about mashups at the beginning of last year, and there were a few awards programs and such where you can find lists of some interesting ones. For the most part the winners are made by fully fledged development teams showing off, but a few from single developers with cool ideas made the cut.
Product hunt and hacker news are good places to get ideas although you have to read a lot of crap to get to a small nugget of something good.
Right now my fun project is playing around with import.io which I discovered by reading this sub. This type of service has moved out of the shadows and has gone mainstream. I have been writing my own scrapers for years so I can definitely see some ways that this could be disruptive to the market and force a few law changes but I can also see some crazy potential. I think that's a company that will either do great or fall flat on their face.
I find it quite a fun pastime looking at programming libraries and seeing what they can do, so generally I just wonder if something exists and Google it, and usually it does. I usually do this when I should be working but I don't want to go the full slacker route and read reddit.
I see a lot of old forgotten opensource projects getting revived recently I guess due to the software boom or whatever you want to call it. Catching them at the right time can give you something new that people haven't seen before.
Anyway, I probably have about 3 ideas a day for mashups so pm me your skype id or something if you want to chat or work on some code.
I started working with open APIs, retrieving, analyzing, storing, and outputting interesting data sets.
Afterwards, and along the way, in no particular order:
A lot of services, like Twitter or Reddit, provide an API to fetch data and/or interact with said services.
For example, you could build a webapp that draws a graph to show the most used subreddits for a given username, the most active hours etc.
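The data-gathering half of that subreddit example might look roughly like this (the username is a placeholder, and a real app would page through more than one request):

```python
from collections import Counter
import requests

USERNAME = "spez"  # placeholder; any public account works
resp = requests.get(
    f"https://www.reddit.com/user/{USERNAME}/comments.json",
    params={"limit": 100},
    headers={"User-Agent": "subreddit-graph-demo/0.1"},
    timeout=10,
)
resp.raise_for_status()

# Tally which subreddits the most recent comments landed in;
# this Counter is what you'd feed into your graphing code.
counts = Counter(
    child["data"]["subreddit"]
    for child in resp.json()["data"]["children"]
)
print(counts.most_common(10))
```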
QR code posted in taxi/buses to link to website/app.
The app/site should have a tour section with places to see, restaurants and such. Optionally show prices to each destination. You can lean on Yelp's API for this.
Pull a local event feed. http://www.programmableweb.com/news/93-events-apis-eventful-upcoming-google-calendar/2012/05/01 Eventful will probably work for this.
I would suggest making the focus less on his business and more on general tourism to increase usage. He will make more money by driving people more places without needing to specifically advertise himself.
Todo list, a tiny social network, or perhaps solve a problem that you find interesting.
Incorporate some REST APIs into your project. Heck, looking through this list may even bring some inspiration for your project itself: http://www.programmableweb.com/apis/directory
Looks like they recently made changes to their API. Could be related? Hopefully its an internal dev, deploy or hosting problem and not an external attack.
No news from their Twitter feed tho :-S Can't believe it, I got held up so didn't get a chance to move the coins I'd bought just before it went down. Fuck!
Edit - sorry for massive URL, on phone.
No worries. I would suggest checking Programmable Web before search engines. And usually any site with a public API will link to it on the bottom of the home page with the label "API" or "Developers". Of course, maybe they didn't when you started.
I'll have to dig around, but I think I recall something about them providing it for developers or something. It was my understanding from a meeting with HopStop that they and others like them used the same API that was supposedly allowing HopStop to do its real-time directions.
Ninja edit: that was easy. Supposedly it's fast, being updated every 5 minutes in terms of service delays and stuff.
Skip the key. "No API key required if you make fewer than 10,000 calls per day." If you're just experimenting, I doubt you'll be making 10,000 calls per day.
You're going to wanna look into the Reddit API
Gonna wanna read up on REST Web Services
Also might wanna take a good look at JSON
This should get you started.
For sure, I'm super into machine learning and recommender systems; my research is on social networks and mapping the organizational structure of user-inputted data, so I'm really interested in how non-tech communities produce data.
You may not know this, but Ravelry is THE site for knitters (I also knit), and as fate and luck would have it, they have an API. I don't know how much you know about recommenders, but this specifically is an awesome data set because they use both tags and star ratings, which means you can build really awesome models of interest on a user-to-user as well as a system-wide basis.
The rest is just toying with different statistical models, figuring out what works and hooking single user preferences in to try to predict what they're interested in. You will have to do some training, but for me that's part of the fun. Mining the Social Web is a great resource for learning more about these sorts of projects. Hope that helps!
It's explained in more depth in the first article in the series:
> Green Button, a secure way to communicate energy usage information electronically using standardized RESTful API Web services and a common data format.
Blizzard has an API. If you can learn how to call that API in your experiments, then that might do it.
AFAIK, all major trackers use the Blizzard API to get their data.
http://www.programmableweb.com/api/battlenet
EDIT: http://us.battle.net/forums/en/bnet/topic/20744814532
There is apparently no official API and they seem to be using scrapers or something else. It's hard to be sure at this point. WoW, D3 & SC2 all have official APIs, but OW does not.
There may be other sources for information about the API, but this is one I found in a quick Google search.
I found these:
http://www.programmableweb.com/api/google-chromecast/sample-source-code
Here's a guy on reddit from a while back who wrote his own:
https://www.reddit.com/r/Chromecast/comments/1xgte4/open_source_local_media_server_for_chromecast/?utm_source=amp_share&utm_medium=tweet
EDIT: What about google's github?
https://github.com/googlecast/
I'm unclear as well. I would say pretty much everyone but you has access to this information since most of it is well documented and open source.
Facebook's excellent GraphQL server/implementation (used to power their next gen server data API):
https://github.com/graphql/express-graphql
Netflix does plenty with Node, but here's a great article about some of their optimizations that leads with this line (found with the Google search "netflix using nodejs"):
>We’ve been busy building our next-generation Netflix.com web application using Node.js.
http://techblog.netflix.com/2014/11/nodejs-in-flames.html
And Microsoft has even forked Node.js to use their Chakra runtime so that Node.js can power the backend of their Universal Windows Platform (if necessary).
I don't mean to feed the trolls, but hopefully this information will be interesting to any future nerds reading the thread!
There is a free API for zip code distance from another provider: http://www.zipcodeapi.com/index.php
Also, Programmable Web is a site where they list web APIs. Here's the search result for "zip code distance".
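And if whatever API or dataset you end up with only gives you each zip code's latitude/longitude, the distance step itself is just the haversine formula (the coordinates below are approximate centroids, just for illustration):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 3958.8 * asin(sqrt(a))  # 3958.8 ≈ Earth's radius in miles

# Example: approximate centroids for 10001 (Manhattan) and 90210 (Beverly Hills)
print(haversine_miles(40.7506, -73.9972, 34.0901, -118.4065))  # roughly 2,450 miles
```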
Hello! The only data that's relevant is from the Australian Bureau of Meteorology (BoM): http://www.bom.gov.au
There are a few Pebble apps that use BoM for tides and charts and one watch face that shows temp. Maybe these links will help or maybe put you off:
http://www.programmableweb.com/api/australian-bureau-meteorology
http://www.rogerclark.net/json-data-from-www-bom-gov-au-a-work-in-progress/
If you can include BoM you will be my new hero!
These MERN, MEAN, and other such acronyms make things much more complicated and confusing for learners than they need to be, and as far as I can tell they're just products of MongoDB marketing trying to get you to use their database, which doesn't reliably store your data.
Also MERN isn't an MVC framework in the traditional sense, because it spans both the server and the browser. So you've got an application on the server spitting out JSON, and you've got another application in the browser consuming JSON and presenting it, with its own logic. Essentially you're building two separate applications.
If you want MVC I'd recommend sticking to server-side templating in Node, and just send HTML to the browser and treat the browser as the View layer... i.e. do things the way things have always been done forever.
Alternatively you can just stick to building the browser-side application by skipping the whole backend entirely and just consume a pre-existing public API such as can be found at http://www.programmableweb.com/
Nothing wrong with innovating: some of best ideas have appeared out of nowhere (apparently). But -- more often -- creation is incremental and opportunistic: two ideas from different realms melded together hit the right space at the right time.
Facebook wasn't anything new -- MySpace had been around a while -- but the concept of using colleges (and, initially, exclusivity) as a vector for spreading adoption paid off for Mark Zuckerberg. People had built cars before Henry Ford did, and had assembly lines: he put those both together with a reasonable daily wage to leapfrog everyone else. Photography had been around for a hundred years before Edwin Land hit upon the idea of putting the development lab inside the camera -- and inventing the Polaroid camera.
So that's why I mentioned taking something that's known to work and putting it into a different context. That still counts as innovation: the trick is to get there before anyone else! Check out a site called Programmable Web -- they are all about mashing up APIs from different sites to make new things...
Keep in mind with CNN you are starting 15 minutes behind.
Take a look at this ( http://www.programmableweb.com/api/nyxdata ) and look at all the related APIs below it.
If you want to find good APIs for just about anything, Programmableweb.com is the place to find it. They list every Government API, and just about any public API out there.
You mean the one pointed to from Programmable Web?
http://dev.plexapp.com redirects straight to http://plex.tv with no explanation. Is there another url I can look at?
Just create an SSL socket for the server. It's not hard. There are many tutorials. If you need help, just hit me.
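Something like this bare-bones sketch with Python's standard library (cert.pem / key.pem are placeholders for a cert you'd generate, e.g. a self-signed pair for testing):

```python
import socket
import ssl

# Minimal TLS-wrapped echo server. cert.pem / key.pem are placeholders
# (e.g. a self-signed pair generated with openssl for testing).
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile="cert.pem", keyfile="key.pem")

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
    sock.bind(("0.0.0.0", 8443))
    sock.listen(5)
    with context.wrap_socket(sock, server_side=True) as ssock:
        conn, addr = ssock.accept()   # blocks until a client connects
        data = conn.recv(1024)
        conn.sendall(data)            # echo it back over the encrypted channel
        conn.close()
```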
You could look into Netflix's network traffic with Wireshark or similar. Maybe you can get some idea of how they are doing it.
They even have an API to use ( http://www.programmableweb.com/api/netflix ).
I think that would be the solution for you. Just try to make a simple emulator for the stuff you need.
If you're comfortable with tinkering, you can use Apple's built-in automator to do this.
Dropbox ~~offers~~ offered a Sync API which you can use, but was deprecated beginning in Oct 2015. Support will be fully dropped by April 2016.
Here is a list of some storage APIs that can be used.
One easy solution would be BackBlaze. I believe you can set it to remote backup, similar to dropbox, but I've not looked into it too much.
Your NAS may have proprietary sync software; e.g., Synology has their app, Cloud Station, which functions similarly to Dropbox and syncs files across devices.
Best place to look for APIs is http://www.programmableweb.com/ and there's nothing on there about Public Alerts so I would be 98% sure there is not. It does list hundreds of Google APIs, but not this one.
Not sure about the 403, maybe the Internet crapped out?! But the second part, yeah, Matias contacted me so I debugged it further, but I couldn't figure out how to work around it. The problem is that the script is supposed to be looking at the entire home page of Learn when it scrapes your course links, but for some reason it only takes links from the News section. So it would find courses which posted news recently but not other ones.
Maybe a workaround would be to look at this page https://learn.uwaterloo.ca/d2l/lms/news/main.d2l?ou=6606 (with 200 per page) instead of the home page when scraping course links, since then there would be at least one news posting from all your courses. This would be a quick hacky fix. It would be cool if we could re-program it entirely using this Desire2Learn API, though: http://www.programmableweb.com/api/desire2learn. I haven't used it / didn't look into it extensively, so I'm not sure if D2L made it so anyone can use it or if you have to be a Learn administrator, etc.
Not sure what you've learned, or what you need to apply to the project, but peruse Programmable Web's huge list of public APIs for ideas: http://www.programmableweb.com/apis/directory
There might even be one there to pull in your football league data.
You can reverse engineer enough of Tinder's API without being any kind of uber coder (case in point), and spintax has been around for a long time.
GIFs and pictures are about as relevant to the issue as hair gel to a bald dude.
I don't know how much searching you've done, but the first thing that comes to my mind is Spotify's API
https://developer.spotify.com/web-api/get-album/
That returns an album which has a 'popularity' value tied to it.
If that doesn't do it for you, you can take a look at this list of public music/media APIs. I don't know if any of these return album sales, but it might point you in the right direction.
http://www.programmableweb.com/category/music/apis?category=19990
There are several ways this can be done. The easiest would be to find a webservice that publishes stock prices. They come in a variety of flavors, usually SOAP or REST. There are various libraries that can handle these webservices.
The other way is to send an HTTP request directly to a website, basically what your browser does, and then take the HTML apart yourself. This is a pain in the ass but works for sites not offering webservices.
I'll add some links when I'm not on mobile anymore.
Edit 0: some quick Google-fu: http://www.programmableweb.com/news/96-stocks-apis-bloomberg-nasdaq-and-etrade/2013/05/22
Edit 1: I'm not a webdev at all, so somebody might know much more about the state of the art, but if I remember correctly Jersey is pretty good for RESTful services: https://jersey.java.net It'll also get you started on Maven, which is very handy (if you use Maven, Apache Commons and Guava are wonderful libraries).
As I was saying, he doesn't need to hit a lot of sites if he is using an API... He just needs to make calls to APIs that will deliver the goods.
http://www.programmableweb.com/category/News%20Services/apis?category=20250
It's fairly easy to do. I mean, pick one and just have it spit out news; if you want, have it target multiple APIs. You're not trawling manually, you've got code that will do it for you. Sure, there is going to be more code involved, but that is what development is all about!
A common variation these days is the RESTful API. You use HTTP to talk to those, similar to how a browser loads a page or sends a form, so almost any programming language can use such a service. A good list can be found here: http://www.programmableweb.com/apis/directory
you're not gonna get into HFT unless you can afford about $10k per day:
with that said, you can do algorithmic trading, just don't confuse this for HFT
so, algorithmic trading, sure... HFT? not so much
with all THAT said, what you want is API access. lots of places have it
http://www.programmableweb.com/news/96-stocks-apis-bloomberg-nasdaq-and-etrade/2013/05/22
so that's where you want to start
also, if you consider yourself an "aspiring" programmer, you should probably invest some time in learning more about programming and specifically studying concurrency, architecture, system administration and scaling
also, learn the basics of finance and financial instruments. This means stuff like insurance (yes it's boring but it's one of the most important fundamentals off of which everything else is based), bonds, standard deviation, etc etc
True, but it's rare that you're going to find data in the exact format you need. Even if you get data from an API, you're probably going to need to do some cleaning. But if it's APIs you're after.
I'd definitely try it out if the YR.no API were implemented. It is very good not only for Norway but also for the rest of the world. Let me know what you think.
I understand you guys are looking for first adopters, but could I have some examples of things I CAN search on the site? I get "Sorry, no APIs were found." for everything I search.
I currently use http://www.programmableweb.com/category/all/apis to find stuff, but if you guys made a fully integrated version of that site I would totally use it.
Zillow's got an API for demographics. http://www.programmableweb.com/news/zillow-adds-neighborhood-getdemographics-api/brief/2015/03/04
Again, I found it, but haven't actually looked at the data to see if it's worth having on a city dashboard.
Also, the UK seems to really have their act together re: city statistics. Check out this resource. Very Nice. https://www.citycontext.com/
It's against CL's terms of service to use any automation for creating listings. I don't think you'll get in trouble for typical GET requests unless you're sending them too fast.
Per their TOS:
"It is expressly prohibited to post content to craigslist using any automated means. Users must post all content personally and manually through all steps of the posting process. It is also expressly prohibited for any user to develop, offer, market, sell, distribute or provide an automated means to perform any step of the posting process (in whole or in part). Any user who develops, offers, markets, sells, distributes or provides an automated means to perform any step of the posting process (in whole or in part) shall be responsible and liable to CL for each instance of access to craigslist (by any user or other third party) using that automated means."
Try this service; it may be easier than writing your own crawler.
Disclaimer: I've never used that service, but I have tried to automate CL functionality before.
They will be told. By the Supreme Court. At least I hope common sense will prevail.
It's pretty hard when you're just starting out, because too often you're building sites just as exercises or just to teach yourself, and it's not as rewarding. My recommendation is to think of the sites you want to build later on, and start building them with the skills you have now. But instead of pulling data from a database, just hard-code everything (basically prototyping without backend functionality). You'll eventually repackage it into smaller bits, but it'll help to have the portions of the modules there.
One way to build new sites is to hop on http://www.programmableweb.com, choose the newest API published, and build a site completely based off of that API. It might also be helpful to learn how to use OAuth early so that's not a barrier later on. But if you're not at the API level, again, just hard code everything.
Building the "obligatory blog" can get boring after a while, so just make sure to mix it up.
You can also use http://itsthisforthat.com/ for ideas to build joke sites.
In no particular order:
* Leftronic
* Ducksboard
* Geckoboard
* Cyfe
* Klipfolio
Here's a decent comparison between some of those.
See also the Programmableweb article "Block.io Aims to Disrupt Bitcoin With Service Free Wallet Api".
Thx for the info. I'm a bit skeptical wrt that, especially wrt integrating JavaScript with stuff like GPU processing (e.g. CUDA). However, in terms of packages and catching up, it seems to be displaying very rapid growth.
PS: update. Node.js may be the future. I was going to mention that a language like Scala is nicer due to type-checking, but this time I decided to search first. Apparently, anything I can think of is already there. Node.js may be the present.
Just to add to the above post (dashing is awesome). Here are some SaaS ones:
I would check out DCRUG or Arlington RUG (Ruby Users Group) http://www.meetup.com/dcruby/
http://www.meetup.com/Arlington-Ruby/
I know the Arlington Ruby guys are really helpful and there are a good amount of newbies who go there just to learn more about Ruby.
I have not been to the DCRUG but I'd imagine they have a similar vibe to Arlington RUG.
As far as how to learn... like others have said, you don't need a class. I am a self-taught programmer who got a programming job out of college, where I didn't take any programming classes.
You don't need classes to learn to be a great programmer. You need a project. Think of something you want built. It can be an existing site that you want to recreate on your own, or something completely new. Just get coding. Start with a hello world tutorial and then slowly expand on that until you have a site hitting a database backend that is doing what you want.
Some common simple projects are:
Basically, the best way to learn is to just do it. And the best part is that you won't get bored because you picked the project that you are working on...not some teacher.
I found some service called AQL. I haven't used it before, but it doesn't mention using a cell anywhere. It might be what you're looking for. I'd look up some reviews on the software to make sure it's still being supported.
In searching for alternatives, I found a list of SMS APIs. These will all be much more developer-oriented than you're likely searching for, but some might have polished software you could use.
You can always query the Google Maps route API via WebView/JavaScript and parse the result. I'm never quite sure if that violates their terms of use or not, but since you are using it with a MapView you should* (not legal advice) be OK.
So,
Instantiate a WebView and then inject the JavaScript (from /assets or as a string should work).
Use a JavaScript interface to call back into your activity with the results.
Parse and build the line segments.
Create an overlay on the map and override the draw. Remember if it is a route with many segments you may want to clean out small segments.
That way you should be able to get both a route and text directions.
The other option is to pull it from another service. Here is a short list:
http://www.programmableweb.com/apis/directory/1?apicat=Mapping
Code something useful. I'm all for reading up and practicing exercises out of books, but it has its limits.
If you're struggling for inspiration try going here: http://www.programmableweb.com
Find an API you like the look of and see if it has a Python wrapper here: http://pypi.python.org/pypi
If it has a wrapper, use it. If not, make a wrapper, using wrappers for other APIs as inspiration. Learning by example. Release it on GitHub and PyPI, opening yourself up to constructive criticism.
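The shape of such a wrapper is usually something like this (the endpoint and fields here are completely made up; the point is one session, a base URL, and one method per endpoint):

```python
import requests

class ExampleClient:
    """Thin wrapper around a made-up JSON API, just to show the shape."""

    BASE_URL = "https://api.example.com/v1"  # hypothetical

    def __init__(self, api_key):
        self.session = requests.Session()
        self.session.headers["Authorization"] = f"Bearer {api_key}"

    def _get(self, path, **params):
        resp = self.session.get(f"{self.BASE_URL}/{path}", params=params, timeout=10)
        resp.raise_for_status()
        return resp.json()

    def search(self, query, limit=10):
        return self._get("search", q=query, limit=limit)

    def item(self, item_id):
        return self._get(f"items/{item_id}")

# client = ExampleClient("YOUR_KEY")
# print(client.search("python"))
```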
Think of ways of joining up some of those APIs. Transferring data from one to another. YouTube favourites -> Delicious (for example).
That way you're making something useful, using and working with existing code that's often well developed, and it's far closer to real-world problems than learning how to slice a list (you'll learn that as you go along).