How much time is left before the start of the Brazil 2014 World Cup? If you want to know the answer, just take a look at www.cup2014countdown.com
UPDATE: I had to power off meteosemplice.com, because I was making too many calls to the API website. Any suggestions for a cheap and affordable weather API service?
Another Django site is online: meteosemplice.com. It's a nice website where you can find weather forecasts for more or less every city in the world.
As usual, I started developing this site to learn new technologies, this time GIS, PostGIS and the OpenWeatherMap API, but in the end it became (I hope) a useful site with a clean interface. So if you want to take a look, just point your browser at meteosemplice.com and start forecasting your weather!
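Under the hood the forecasts come from the OpenWeatherMap API; a minimal sketch of the kind of call involved (the endpoint is OWM's current-weather API; "YOUR_API_KEY" and the function names are placeholders of mine):

```python
# Sketch of an OpenWeatherMap current-weather lookup; the endpoint is
# from the OWM docs, the API key is a placeholder.
import json
import urllib.parse
import urllib.request

OWM_URL = "https://api.openweathermap.org/data/2.5/weather"

def forecast_url(city, api_key):
    """Build the current-weather request URL for a city."""
    qs = urllib.parse.urlencode({"q": city, "units": "metric", "appid": api_key})
    return "%s?%s" % (OWM_URL, qs)

def current_weather(city, api_key):
    """Fetch and decode the JSON answer for one city."""
    with urllib.request.urlopen(forecast_url(city, api_key)) as resp:
        return json.load(resp)
```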
UPDATE: This article is deprecated. Twitter has retired API 1. With API 1.1 an OAuth authentication is mandatory, so the better way is to build a server-side proxy to get the JSON data.
So, I'm really a newbie with Octopress; I started using it more or less two months ago, but I really like it. It's fast, it's simple, it's easy. The only thing I disliked when I started was that the Twitter aside didn't work for me: it hung on "Twitter's busted" and nothing happened after that.
The original aside code hung, but after some tweaking of the Twitter call I managed to make it work. It works! So now I'm a happy, tweeting Octopress blogger.
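Since API 1.1 wants OAuth, the aside now needs a small server-side proxy, as the update above says. A minimal sketch in Python, using Twitter's application-only (bearer token) auth; the user_timeline endpoint is from the 1.1 docs, while the bearer token and the function names are placeholders of mine:

```python
# Sketch of a tiny server-side proxy for the Twitter 1.1 API.
# The bearer token comes from Twitter's application-only auth flow.
import json
import urllib.parse
import urllib.request

API = "https://api.twitter.com/1.1/statuses/user_timeline.json"

def timeline_url(screen_name, count=4):
    """Build the user_timeline request the proxy forwards."""
    qs = urllib.parse.urlencode({"screen_name": screen_name, "count": count})
    return "%s?%s" % (API, qs)

def proxy_timeline(screen_name, bearer_token):
    """Fetch the timeline server-side and return the decoded JSON,
    ready to be handed to the aside's javascript."""
    req = urllib.request.Request(
        timeline_url(screen_name),
        headers={"Authorization": "Bearer %s" % bearer_token})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```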
Well, first of all, my problem: more or less one year ago I wrote a nice Python script, a shell script that every night downloads data from my Analytics account and saves it to a database. It's useful for my job, because I can use this data to analyze how my business websites are doing.
Then, more or less a week ago, my script stopped working. I got a 401 error: rate limit exceeded, you need to authenticate…
But my script was authenticated! Yes, I used an old gdata API and the plain old ClientLogin authorization protocol, but it worked until last week.
Anyway, I decided to switch to OAuth 2.0 authentication. There is a ton of documentation about how OAuth 2.0 works, and a ton about how a web application or a desktop application can use OAuth 2.0 to authenticate with the Google APIs, but there is really poor documentation about how a server-side script can authenticate itself with Google without a human present.
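For the record, the unattended case is covered by Google's service-account flow: the script builds a JWT claim set, signs it with the service account's private key and trades it for an access token, with no human in the loop. A sketch of the claim set, based on Google's OAuth 2.0 service-account docs (the account email and scope are placeholders; the signing and token-exchange steps are omitted):

```python
# Claim set a server-side script signs and posts to Google's token
# endpoint to get an access token (valid for one hour).
import time

TOKEN_ENDPOINT = "https://accounts.google.com/o/oauth2/token"

def jwt_claims(service_account_email, scope, now=None):
    """Build the JWT claim set for Google's service-account flow."""
    now = int(now if now is not None else time.time())
    return {
        "iss": service_account_email,   # the service account's email
        "scope": scope,                 # e.g. the Analytics read-only scope
        "aud": TOKEN_ENDPOINT,          # audience: the token endpoint itself
        "iat": now,                     # issued-at, unix time
        "exp": now + 3600,              # tokens live at most one hour
    }
```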
Well, first of all: I'm lazy, really a lazy man; so lazy that I'm a developer. And I'm a developer because I can automate most of my job.
Now, the picture: I have to manage a couple of servers over ssh. Nothing really hard so far; yes, I have to use my keyboard and type something like "ssh bla bla bla". It's a hard job, but I can do it.
Problem: my servers sit behind a VPN. So I have to open my VPN client and connect to the VPN server, and only after that can I start typing "ssh bla bla bla".
OK, I have Viscosity, a really nice VPN client, and it takes just a couple of clicks to connect to my VPN server, but it's too much for me!!! I can't survive the effort.
So I wrote a little AppleScript. Easy enough :-) yes, you have to substitute "nameofyourviscosityconnection" with the name of your Viscosity connection and "bla bla bla" with your VPN IP, but I'm sure you can do it. The script is self-explanatory: it checks whether the VPN is connected and opens the connection if it isn't; then, if a Terminal window is already open, the script uses it to run the ssh command, otherwise it opens a new Terminal window and runs the ssh command there.
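For the curious, the same idea can be sketched as a shell function that drives Viscosity through osascript. This is a hypothetical translation, not the original script: the Viscosity AppleScript calls (connect, the state of a connection) are assumed from its scripting support, and "MyVPN" and "user@10.0.0.1" are placeholders for your connection name and server:

```shell
# Ask Viscosity (via osascript) whether the VPN is up.
vpn_state() {
  osascript -e 'tell application "Viscosity" to return state of the first connection whose name is "MyVPN"'
}

# Connect the VPN if needed, then ssh to the server.
goservergo() {
  if [ "$(vpn_state)" != "Connected" ]; then
    osascript -e 'tell application "Viscosity" to connect "MyVPN"'
    sleep 5   # give the tunnel a moment to come up
  fi
  ssh user@10.0.0.1
}
```

It skips the "reuse an open Terminal window" part, since a shell function already runs in the current terminal.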
Now open your ~/.bash_profile, add an alias, and that's all. Next time, just type "goservergo" in your terminal and you will be connected.
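The alias itself might look like this, assuming you saved the AppleScript somewhere like ~/scripts (the path and filename are just my choices):

```shell
# in ~/.bash_profile: run the saved AppleScript via osascript
alias goservergo='osascript ~/scripts/connect-and-ssh.scpt'
```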
Well, "another Heroku site is online", or better, "another Octopress website now runs on Heroku".
I just switched jaco.it to Heroku, and I have to say it was a piece of cake.
Yes, I know there is nothing really hard about moving a little site like jaco.it, and nothing really hard about moving an Octopress website, but it's my first step into Heroku land and I'm sure I'll be curious enough to learn the power of Heroku really fast.
Interview conducted and published by ict4executive.
Well, one of the most interesting features in Django is the cache framework; every Django developer uses the Django cache to speed up performance. On the other side, Nginx is probably the fastest server for serving static files. Finally, this story also involves uWSGI and memcached: a really solid application server and a really fast in-memory store.
Now, a classic deploy could be:
- nginx as reverse proxy and static file server
- uwsgi as application server
- django as framework and cache manager
- memcached as cache storage
So every request is handled by Django. The chain is nginx, then uWSGI, then Django: if the page is in memcached, Django serves it from the cache; if not, it executes a Django view.
Our goal is another chain:
- nginx as reverse proxy
- nginx to serve any request already in the cache
- nginx to handle static files
- uwsgi as application server
- django only as cache writer
- memcached as cache storage
So, let's start. The Django side is a small middleware that, on every cacheable response, writes the rendered page into memcached under a key nginx can compute from the request URI.
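A hypothetical sketch of that middleware (old-style, pre-1.10 middleware; nginx_cache_key mirrors Django's real default key function, while the class and timeout are placeholders of mine):

```python
# Django writes pages into memcached under keys nginx can rebuild.
# Django's default KEY_FUNCTION builds keys as "<prefix>:<version>:<key>".

def nginx_cache_key(prefix, version, path):
    """Mirror Django's default make_key so nginx and Django agree."""
    return "%s:%s:%s" % (prefix, version, path)

class MemcachedWriterMiddleware(object):
    """Store successful GET responses so nginx can serve them directly."""

    def process_response(self, request, response):
        # deferred import so the sketch loads without a Django setup
        from django.core.cache import cache
        if request.method == "GET" and response.status_code == 200:
            # with KEY_PREFIX = "CUSTOMKEY" this stores "CUSTOMKEY:1:<path>"
            cache.set(request.get_full_path(), response.content, 60 * 5)
        return response
```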
Our settings.py configures memcached as the cache backend, with a key prefix that nginx will reuse.
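A minimal sketch of that cache section (the memcached address is a placeholder; KEY_PREFIX is the "CUSTOMKEY" the nginx configuration must use too):

```python
# settings.py (cache section): Django writes pages into the same
# memcached instance nginx reads from.
CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.memcached.MemcachedCache",
        "LOCATION": "127.0.0.1:11211",
        "TIMEOUT": 300,
        "KEY_PREFIX": "CUSTOMKEY",  # nginx must build its keys with this prefix
    }
}
```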
And the nginx site-enabled configuration makes nginx try memcached first and fall back to uWSGI only on a cache miss.
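A sketch of that configuration (ports, paths and server_name are placeholders; memcached_pass and $memcached_key come from nginx's stock memcached module):

```nginx
server {
    listen 80;
    server_name example.com;

    # static files served directly by nginx
    location /static/ {
        alias /srv/myapp/static/;
    }

    location / {
        # must match the key Django writes: "<KEY_PREFIX>:<version>:<path>"
        set $memcached_key "CUSTOMKEY:1:$request_uri";
        default_type text/html;
        memcached_pass 127.0.0.1:11211;
        # on a cache miss (or memcached down) fall back to uwsgi/Django
        error_page 404 502 504 = @django;
    }

    location @django {
        include uwsgi_params;
        uwsgi_pass unix:/tmp/myapp.sock;
    }
}
```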
Where CUSTOMKEY is the same key prefix used in settings.py.
Cafeweb.it is a really nice, fast-growing website, and I'm really enthusiastic about their interest in postacap.
I developed postacap.it just for fun, to learn new technologies, and I really never expected it could be a useful (and used) website, but… it happened!
So now I'm really proud that a large site like cafeweb.it has acquired postacap; I'm sure they will do their best to improve the service.
VaiSulWeb is a fast-growing European hosting service provider that serves mostly small-business customers from a modern private-cloud infrastructure built on the Windows Server 2012 operating system and managed by Microsoft System Center 2012. VaiSulWeb looked to drive competitive advantage and grow markets by upgrading to System Center 2012 Service Pack 1 and implementing Windows Azure Services for Windows Server. With these technologies, VaiSulWeb was able to rapidly deploy a multitenant management portal with which it can offer self-service websites and virtual machines at low cost and high scale. The customer’s experience is similar to what they would enjoy with Windows Azure, the Microsoft public cloud service, but delivered from VaiSulWeb’s own data centers. VaiSulWeb also lowered data center costs by 50 percent by slashing hardware needs and centralizing data center management.