The ingredients for adding a custom jQuery script to a WordPress theme:
1) a custom function in the theme's functions.php file;
2) the jQuery file itself;
3) an add_action() line.
Before you start, please install the Google Libraries plugin!
1) in [website root]\wordpress\wp-content\themes\current_theme\functions.php add these lines:
<?php
// My own nitwit jQuery script
function nitwit() {
    wp_enqueue_script( 'shittycode', THEME . '/scripts/myscript/alert.js', array( 'jquery' ), '1.0', false );
} // End nitwit()
add_action( 'wp_print_scripts', 'nitwit', 1 );
?>
For each jQuery file you wish to include, add another wp_enqueue_script() line to the function. The line above reads as follows:
The handle by which I refer to this jQuery file is 'shittycode'. It is located at THEME . '/scripts/myscript/alert.js', requires jQuery to run, and has a version number of 1.0. The `false` at the end denotes whether the script is to be loaded in the footer of the theme (on wp_footer()). If false, the script loads in wp_head() of the current theme.
2) Put a script file named alert.js in [website root]\wordpress\wp-content\themes\current_theme\scripts\myscript\,
contents:
// WordPress loads jQuery in noConflict mode, so `$` is not defined globally;
// passing `$` into the ready() callback makes it available inside.
jQuery(document).ready(function($) {
    alert("Hello world!");
});
3) The add_action() line must be placed directly beneath the function.
---------
Explanation of wp_print_scripts
Prints script tags in document head. Called by admin-header.php and by wp_head hook. Since it is called by wp_head on every page load, the function does not instantiate the WP_Scripts object unless script names are explicitly passed. Does make use of already instantiated $wp_scripts if present. Use provided wp_print_scripts hook to register/enqueue new scripts.
---------
Result: when the WordPress wp_print_scripts() function runs, the nitwit() function is added to the list of functions to run. It has been given a priority of 1 (priorities can be used to control which functions execute before others).
Specific pages:
To include the jQuery file on only one particular page, use a specific action hook. Action hooks are essentially placeholders. Wherever an action hook is placed, it will execute any code that has been "hooked" to it. Let's visualize this with some default WordPress action hooks found in most themes. You can find wp_head and wp_footer in just about every single theme available, and most people don't realize these are action hooks. They're simply placeholders that plugins can use to insert code into the <head> and footer of the theme. Often, plugins use these action hooks to insert things like CSS or Analytics code. They create a function that generates the code, and then "hook" that function to either wp_head or wp_footer.
Examples:
add_action('wp_head', 'nitwit'); // action hook: wp_head
add_action('wp_footer', 'nitwit'); // action hook: wp_footer
If we want to load the script on all pages (but not on posts), we add an if statement with the is_page() conditional tag, like this:
function nitwit() {
    if ( is_page() ) {
        wp_enqueue_script( 'shittycode', THEME . '/scripts/calendar/alert.js', array( 'jquery' ), '1.0', false );
    }
}
add_action( 'wp_head', 'nitwit', 1 );
We can filter this a bit more, so that the script is only loaded on the page whose ID is 30:
function nitwit() {
    if ( is_page( 30 ) ) {
        wp_enqueue_script( 'shittycode', THEME . '/scripts/calendar/alert.js', array( 'jquery' ), '1.0', false );
    }
}
If we want to use the function on multiple pages, we can pass an array of page IDs to is_page(); PHP's range() function builds such an array:
range( 1, 10 )
returns the array of all IDs from 1 through 10 (endpoints included), so pages with those IDs use the function and pages with other IDs do not.
function nitwit() {
    if ( is_page( range( 30, 35 ) ) ) {
        wp_enqueue_script( 'shittycode', THEME . '/scripts/calendar/alert.js', array( 'jquery' ), '1.0', false );
    }
}
We can check this by changing the alert.js file like this:
jQuery(document).ready(function($) {
    $('#mydiv').html('hello world!');
});
Add this line to page.php:
<div id="mydiv"></div>
Now, any page with an ID between 30 and 35 will show the line 'hello world!'
Amazon S3 (http://aws.amazon.com/s3/) is storage for the Internet. It provides a simple web services interface that can be used to store and retrieve any amount of data, at any time, from anywhere on the web. It gives any developer access to the same highly scalable, reliable, secure, fast, inexpensive infrastructure that Amazon uses to run its own global network of websites. The service aims to maximize the benefits of scale and to pass those benefits on to developers. In short: it's a nice remote place to store your precious backups!
First, install these packages on your NAS:
ipkg
python
sudo
More information about this:
http://stam.blogs.com/8bits/2011/11/ssh-tunnel-with-synology-nas.html
http://stam.blogs.com/8bits/2011/11/how-to-run-python-27-on-synology-nas.html
Download the latest S3cmd (http://s3tools.org/download) and copy it to the public or temp folder on the NAS. At the moment, the latest version of s3cmd is s3cmd-1.0.1.tar.gz
Log into the NAS using SSH.
Bash commands:
cd /volume1/public
tar -xzf s3cmd-1.0.1.tar.gz
rm s3cmd-1.0.1.tar.gz
mv s3cmd-1.0.1 s3cmd
cd s3cmd
sudo python2.7 ./s3cmd --configure
You'll need your Amazon S3 access key and secret key (you can get them here: http://aws.amazon.com/s3/)
Now, to backup the data of your NAS, create a bucket (http://aws.amazon.com/s3/#getting-started) and create a folder inside that bucket. Then, use the command line on the NAS (login with SSH and you're good to go):
sudo python2.7 /volume1/public/s3cmd/s3cmd sync --delete-removed --exclude '@eaDir*' --exclude 'Thumbs.db' /volume1/backup/ s3://<bucketname>/<foldername>/
If that doesn't work, you may be able to use Python 2.5 instead, like this:
sudo python2.5 /volume1/public/s3cmd/s3cmd sync --delete-removed --exclude '@eaDir*' --exclude 'Thumbs.db' /volume1/backup/ s3://<bucketname>/<foldername>/
The contents of the backup folder on the NAS will be uploaded (synced) to the S3 bucket.
Now, you can create a cronjob on the NAS to automate your backups.
More information about that: http://www.google.nl?q=synology+crontab
After editing /etc/crontab, restart the cron daemon. Either use:
sudo synoservice --restart crond
or stop and start it manually:
sudo /usr/syno/etc.defaults/rc.d/S??crond.sh stop
sudo /usr/syno/etc.defaults/rc.d/S??crond.sh start
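As a sketch, a nightly backup entry in /etc/crontab could look like the line below. The 03:00 schedule and the /opt/bin/python2.7 path are assumptions (ipkg usually installs into /opt/bin; adjust to your setup); the s3cmd path and bucket placeholders come from the sync command above.

```shell
# /etc/crontab entry: run the S3 sync every night at 03:00.
# Field order on Synology: minute  hour  mday  month  wday  who  command
# Note: Synology's crond expects tabs (not spaces) between the fields.
0	3	*	*	*	root	/opt/bin/python2.7 /volume1/public/s3cmd/s3cmd sync --delete-removed --exclude '@eaDir*' --exclude 'Thumbs.db' /volume1/backup/ s3://<bucketname>/<foldername>/
```

Don't forget to restart the cron daemon afterwards, as described above.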
Further reading:
http://forum.synology.com/wiki/index.php/How_to_backup_the_Synology_Server_to_Amazon_S3
A simple walkthrough to install Python 2.7 on Synology NAS:
1) install ipkg
Read this blogposting for more information about installing ipkg:
http://setaoffice.com/2011/04/08/how-to-install-compiled-programs-on-a-synology-nas/
2) run 'ipkg update'
3) run 'ipkg install python27'
It will install Python and all dependencies
4) now, the command 'python2.7' will start the interactive Python interpreter, where you can try:
print 'hello world'
There are many other packages which can be installed on the Synology NAS with Optware, take a look at this website:
http://www.nslu2-linux.org/wiki/Optware/HomePage
complete list: http://www.nslu2-linux.org/wiki/Optware/Packages
Why not install nano, sudo, etc.? They will definitely make your life easier!
Here's my solution for creating an SSH tunnel through a Synology NAS, in order to browse the internet safely.
Unfortunately, there were some errors in my last posting about location-based tweets. I fixed them with Google App Engine's 'urlfetch'.
Now, here is the feed with all location-based tweets about 'beer' in the area around the Dam in Amsterdam:
http://geotwittsearch.appspot.com/?w=beer&nb=52.368674&el=4.890467&km=2
You can simply add the feed to Google Reader.
Use it with Google Maps like this:
Note:
If you don't want to search for a keyword and just want all available location-based tweets in a specific area, you can use the * wildcard, for example:
http://maps.google.com/?q=http://geotwittsearch.appspot.com/?w=*%26nb=52.368674%26el=4.890467%26km=2
Read the new blog posting about location-based tweets.
I've created a script which parses Twitter (almost) in real time and shows the location of the Twitter user. Just change the keyword you want to look for (the 'w' variable) and use the GPS coordinates for the area you want to monitor (the 'nb' and 'el' variables).
If all works well (and the Twitter API is not down), you can open the feed in a standard RSS feed parser, or you can use it with Google Maps.
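To adapt the feed for your own keyword and area, the URL can be assembled programmatically. A minimal sketch: the parameter names ('w', 'nb', 'el', 'km') are taken from the feed URL above, but the build_feed_url helper itself is hypothetical, not part of the original script.

```python
# Assemble the geo-search feed URL from its parts.
# Parameter names ('w', 'nb', 'el', 'km') match the feed URL shown earlier;
# the helper function is my own illustration.
from urllib.parse import urlencode

BASE_URL = 'http://geotwittsearch.appspot.com/'

def build_feed_url(keyword, lat, lon, radius_km):
    """Return the feed URL for `keyword` around (lat, lon), within radius_km."""
    query = urlencode({'w': keyword, 'nb': lat, 'el': lon, 'km': radius_km})
    return BASE_URL + '?' + query

# Example: tweets about 'beer' around the Dam in Amsterdam
print(build_feed_url('beer', 52.368674, 4.890467, 2))
# -> http://geotwittsearch.appspot.com/?w=beer&nb=52.368674&el=4.890467&km=2
```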
I'm updating my scripts almost daily, so if you want to know more or if you have any ideas about the practical use of Twitter, let me know !
Microsoft Security Essentials warned me that my Windows PC contained malware and that it had cleaned my system. Time for a test with some online virus scanners. First, I tried Trend Micro HouseCall. HouseCall found nothing and told me my system was safe and contained no malware. Then I tried ESET Online Scanner. ESET did find the specific malware on my Windows PC, even though Microsoft Security Essentials claimed it had deleted the malware and Trend Micro HouseCall said my system was safe. Well done, ESET!
Google App Engine manages the scaling and maintenance of data storage automatically. GAE's abstraction for data is easy to understand, but it is not obvious how to best take advantage of its features. In particular, it's surprisingly different from a relational database. Google doesn't call it a 'database', but a 'datastore'. It's best understood as an object database. An object in the datastore is known as an entity.