[WordPress] filter out unneeded menu classes

Here’s a code snippet for WordPress to filter out unneeded menu classes. Put the following snippet in your functions.php file:

// Reduce nav classes, keeping only the classes whitelisted below
function nav_class_filter($var)
{
    return is_array($var) ? array_intersect($var, array(
        'menu',
        'menu-main',
        'menu-primary',
        'menu-item',
        'sub-menu',
        'menu-last-item',
        'menu-first-item',
        'menu-noparent',
        'menu-parent',
        'menu-top',
        'current-menu-item'
    )) : '';
}
add_filter('nav_menu_css_class', 'nav_class_filter', 100, 1);
// Strip the id attribute from menu items
function my_css_attributes_filter($var)
{
    return is_array($var) ? array() : '';
}
add_filter('nav_menu_item_id', 'my_css_attributes_filter', 100, 1);

[WordPress] set cookie

Here’s a code snippet for WordPress. Since WordPress doesn’t use PHP sessions by default, a cookie can be useful instead. Here’s a snippet for your functions.php file:

//Set cookie
function set_newuser_cookie() {
	if (!isset($_COOKIE['sitename_newvisitor'])) {
		setcookie('sitename_newvisitor', 1, time()+1209600, COOKIEPATH, COOKIE_DOMAIN, false); // 1209600 seconds = two weeks
	}
}
add_action( 'init', 'set_newuser_cookie');
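
If you want to act on the cookie later, you can check for it on a subsequent request. Here is a minimal sketch (the function name, hook, and markup are just examples):

// Example: greet visitors who don't have the cookie yet
function greet_new_visitor() {
	if (!isset($_COOKIE['sitename_newvisitor'])) {
		echo '<p class="welcome">Welcome, new visitor!</p>';
	}
}
add_action('wp_footer', 'greet_new_visitor');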

[WordPress & Genesis] Add parent and child classes to menu

Here’s another code snippet, this time for the WordPress Genesis Framework.

If you would like to add classes to parent and child menu items, use the code below.
Put this in the functions.php file:


// Function to add parent and child classes to menu
class Arrow_Walker_Nav_Menu extends Walker_Nav_Menu
{
    function display_element($element, &$children_elements, $max_depth, $depth, $args, &$output)
    {
        $id_field = $this->db_fields['id'];
        if (0 == $depth) {
            $element->classes[] = 'menu-top'; // top-level menu item
            if (empty($children_elements[$element->$id_field])) {
                $element->classes[] = 'menu-noparent'; // no children
            }
        }
        if (!empty($children_elements[$element->$id_field])) {
            $element->classes[] = 'menu-parent'; // has children
        }
        parent::display_element($element, $children_elements, $max_depth, $depth, $args, $output);
    }
}
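
To use the walker, pass it to wp_nav_menu. A minimal sketch for a standard menu call (the theme_location 'primary' is just an example; in Genesis you would typically set this through the wp_nav_menu_args filter):

// Example: render the primary menu with the custom walker
wp_nav_menu(array(
    'theme_location' => 'primary',
    'walker'         => new Arrow_Walker_Nav_Menu()
));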

[WordPress & Genesis] Add menu classes to first and last menu items

Here’s a code snippet for the WordPress Genesis Framework.

If you would like to add menu classes to the first and last menu items, use the code below. Put this in the functions.php file:

// Function to add menu classes to first and last menu items
function add_first_and_last($items)
{
    if (!empty($items)) {
        $items[1]->classes[]             = 'menu-first-item';
        $items[count($items)]->classes[] = 'menu-last-item';
    }
    return $items;
}
add_filter('wp_nav_menu_objects', 'add_first_and_last');

[Synology] How to secure Photo Station with .htaccess

Here’s a short instruction on how to protect your Synology Photo Station using .htaccess:

Create the following file: /volume1/@appstore/PhotoStation/photo/.htaccess

AuthName "Restricted Area"
AuthType Basic
AuthUserFile /volume1/@appstore/PhotoStation/photo/.htpasswd
AuthGroupFile /dev/null
require valid-user

and the following file: /volume1/@appstore/PhotoStation/photo/.htpasswd

admin:xxxxxxxxxxxxxxxx

Use an online htpasswd generator to create the hash for your own password.
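
If OpenSSL is available on your system, you can also generate the hash yourself; a quick sketch (replace 'yourpassword'):

# Append an Apache-compatible (APR1) entry for user admin
printf "admin:%s\n" "$(openssl passwd -apr1 'yourpassword')" >> /volume1/@appstore/PhotoStation/photo/.htpasswd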

[Synology] Monitoring Apache with mod_status

Here’s another quick manual. If you would like to monitor your Apache web server, you can do that with mod_status. Open the httpd.conf-user file:

pico /usr/syno/apache/conf/httpd.conf-user

Copy and paste the following content into it:

<Location /server-status>
   SetHandler server-status
   Order Deny,Allow
   Deny from all
   # "Allow from all" opens this to everyone; consider restricting it,
   # e.g. to the host running your monitoring tool
   Allow from all
</Location>

Save the httpd.conf-user file and restart Apache with:

/usr/syno/etc/rc.d/S97apache-user.sh restart

You can now obtain the Apache server status by querying the following URL:

http://diskstation/server-status?auto

This might be useful for people working with Cacti.
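
From the command line you can fetch the machine-readable output with curl; the numbers below are illustrative, but the field names are standard mod_status output:

curl -s "http://diskstation/server-status?auto"
# Typical fields in the response:
#   Total Accesses: 1234
#   Total kBytes: 5678
#   Uptime: 86400
#   ReqPerSec: .0142
#   BusyWorkers: 1
#   IdleWorkers: 9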

[Synology] Run SABnzbd behind Apache

Here’s a quick instruction for those who would like to run SABnzbd behind Apache on a Synology NAS system. Open up an SSH connection and create the following file:

nano /usr/syno/etc/sites-enabled-user/sabnzbd.conf

Copy and paste the contents below into this file. Please note that my SABnzbd port is 9090; if yours differs, adjust the config below accordingly.

# Put this after the other LoadModule directives
LoadModule proxy_module /usr/syno/apache/modules/mod_proxy.so
LoadModule proxy_http_module /usr/syno/apache/modules/mod_proxy_http.so

<Location /sabnzbd>
order deny,allow
deny from all
# "allow from all" opens the proxy to everyone; restrict it if needed
allow from all
ProxyPass http://localhost:9090/sabnzbd
ProxyPassReverse http://localhost:9090/sabnzbd
</Location>

Save the file with Ctrl-X. Restart Apache with the following command:

/usr/syno/etc/rc.d/S97apache-user.sh restart
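
To verify that the proxy works, request the new location (replace diskstation with your NAS hostname or IP):

# Should return SABnzbd's web interface via Apache
curl -I http://diskstation/sabnzbd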

Environment variable HADOOP_CMD must be set before loading package rhdfs

If you unexpectedly get the following message with R under Ubuntu:

Error : .onLoad failed in loadNamespace() for 'rhdfs', details:
  call: fun(libname, pkgname)
  error: Environment variable HADOOP_CMD must be set before loading package rhdfs

Then try adding the following line to the /etc/environment file:

HADOOP_CMD="/usr/local/hadoop/bin/hadoop"
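
Alternatively, you can set the variable from within R itself before loading the package:

# Point rhdfs at the Hadoop binary, then load and initialise it
Sys.setenv(HADOOP_CMD="/usr/local/hadoop/bin/hadoop")
library(rhdfs)
hdfs.init()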

Hopefully this solves the problem above!

Tuning MapReduce Jobs

If you want to tune MapReduce jobs, it is useful to keep an eye on a number of parameters. The following parameters can be important:

mapred.tasktracker.map.tasks.maximum = The maximum number of map tasks that will be run simultaneously by a task tracker.
mapred.tasktracker.reduce.tasks.maximum = The maximum number of reduce tasks that will be run simultaneously by a task tracker.
mapred.reduce.tasks = The default number of reduce tasks per job.
mapred.map.tasks = The default number of map tasks per job. Ignored when mapred.job.tracker is “local”.

These can all be configured in the mapred-site.xml file. Here is an example:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
<!-- In: conf/mapred-site.xml -->
<property>
  <name>mapred.job.tracker</name>
  <value>node1:54311</value>
</property>
<property>
  <name>mapred.tasktracker.map.tasks.maximum</name>
  <value>8</value>
</property>
<property>
  <name>mapred.tasktracker.reduce.tasks.maximum</name>
  <value>8</value>
</property>
<property>
  <name>mapred.reduce.tasks</name>
  <value>10</value>
</property>
<property>
  <name>mapred.map.tasks</name>
  <value>10</value>
</property>
</configuration>

As a short test you can, for example, have Hadoop compute pi:

hadoop jar /usr/local/hadoop/hadoop-examples-1.0.3.jar pi 10 10
hadoop dfs -rmr /user/hduser/PiEstimator_TMP_3_141592654
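
While experimenting with these settings you can keep an eye on running jobs from the shell (Hadoop 1.x syntax, matching the version used above):

# List running jobs and inspect the status of a specific one
hadoop job -list
hadoop job -status <job-id>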

More information about tuning can be found here: Pro Hadoop Ch. 6

RStudio & Ubuntu & Hadoop

I managed to get RStudio running on the Hadoop cluster. RStudio is a development environment (IDE) for the R programming language.

To install RStudio under Ubuntu, run the following commands:

sudo apt-get install r-base
sudo apt-get install gdebi-core
sudo apt-get install libssl0.9.8 libapparmor-dev
wget http://download2.rstudio.org/rstudio-server-0.96.331-i386.deb
sudo gdebi rstudio-server-0.96.331-i386.deb

The installation can then be validated with:

sudo rstudio-server verify-installation

The next step is to log in on port 8787 of the server on which RStudio is installed:

http://<server-ip>:8787

We can log in with the hduser account from the earlier tutorials on this website. It is important that the correct environment variables are set when running a script.

A short mapreduce test can be run with:

Sys.setenv(HADOOP_CMD="/usr/local/hadoop/bin/hadoop")
Sys.setenv(HADOOP_HOME="/usr/local/hadoop")
Sys.setenv(HADOOP_STREAMING="/usr/local/hadoop/contrib/streaming/hadoop-streaming-1.0.3.jar")
library(rmr)
small.ints = to.dfs(1:10)  # first write a small sample dataset to the DFS
out = mapreduce(input=small.ints, map=function(k,v) keyval(v, v^2))
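
The result can be pulled back into R with from.dfs:

# Retrieve the computed key/value pairs as an R list
from.dfs(out)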

Check the log files carefully for any error messages!

The environment variables can be inspected with this command:

Sys.getenv()
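
Or, to check a single variable:

Sys.getenv("HADOOP_CMD")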

Here is a more extensive script that should work well. It tests the mapreduce and hdfs functions.

#!/usr/bin/env Rscript

# Copyright 2011 Revolution Analytics
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

## classic wordcount
## input can be any text file
## inspect output with from.dfs(output) -- this will produce an R list watch out with big datasets

Sys.setenv(HADOOP_CMD="/usr/local/hadoop/bin/hadoop")
Sys.setenv(HADOOP_HOME="/usr/local/hadoop")
Sys.setenv(HADOOP_STREAMING="/usr/local/hadoop/contrib/streaming/hadoop-streaming-1.0.3.jar")

library(rmr)
library(rhdfs)

## @knitr wordcount
wordcount = function (input, output = NULL, pattern = " ") {
  mapreduce(input = input ,
            output = output,
            input.format = "text",
            map = function(k,v) {
                      lapply(
                         strsplit(
                                  x = v,
                                  split = pattern)[[1]],
                         function(w) keyval(w,1))},
                reduce = function(k,vv) {
                    keyval(k, sum(unlist(vv)))},
                combine = T)}
## @knitr end

# Put a sample input file into HDFS (used by the hadoop backend)
rmr:::hdfs.put("/etc/passwd", "/tmp/wordcount-test")
# Refresh the local copy (used by the local backend)
file.remove("/tmp/wordcount-test")
file.copy("/etc/passwd",  "/tmp/wordcount-test")
# Run the wordcount with both backends and compare the results
rmr.options.set(backend = "local")
out.local = from.dfs(wordcount("/tmp/wordcount-test", pattern = " +"))
rmr.options.set(backend = "hadoop")
out.hadoop = from.dfs(wordcount("/tmp/wordcount-test", pattern = " +"))

A more extensive manual can be found here:

http://www.strengholt-online.nl/wp-content/uploads/hadoop/RHadoop.pdf
http://www.strengholt-online.nl/wp-content/uploads/hadoop/LocalVM.pdf

blog about anything technical…