
The WSO2 Web Services Framework provides comprehensive WS-Security (WSS) and WS-I compliant security for SOAP- and REST-based web services, with bindings in multiple languages including Java, PHP, Python, C, Ruby and many more. Unfortunately, if you are attempting to compile this library extension for PHP > 5.3, you are going to have a bad time.

The first error you will run into is: 'zend_class_entry' has no member named 'default_properties'

The second error, once you find a way around the first, is: 'struct _php_core_globals' has no member named 'safe_mode'. Both are due to changes made in PHP 5.4; Safe Mode specifically was deprecated in 5.3 and removed in 5.4. See the PHP Safe Mode documentation for more details.

The third error you may encounter is along the lines of: 'CHECKUID_CHECK_FILE_AND_DIR' undeclared, which is also due to deprecated and removed components of PHP.

Fortunately the fixes are few and easy. Here are the patches:


@@ -458,8 +458,12 @@
     zend_hash_init(intern->std.properties, 0, NULL, ZVAL_PTR_DTOR, 0);
+#if PHP_VERSION_ID < 50399
     zend_hash_copy(intern->std.properties, &class_type->default_properties,
             (copy_ctor_func_t) zval_add_ref, (void *) &tmp, sizeof (void *));
+#else
+    object_properties_init(&intern->std, class_type);
+#endif
     retval.handle = zend_objects_store_put(intern,
             (zend_objects_store_dtor_t) zend_objects_destroy_object,


@@ -1986,10 +1986,6 @@
 	if (VCWD_REALPATH(path, resolved_path_buff)) 
-		if (PG(safe_mode) && (!php_checkuid(resolved_path_buff, NULL, CHECKUID_CHECK_FILE_AND_DIR))) 
-		{
-			return NULL;
-		}
 		if (php_check_open_basedir(resolved_path_buff TSRMLS_CC)) 

You’ll notice that in wsf_util.c we simply removed that particular check, because both the function and the constant no longer exist. There may be a better solution to this, but for the moment we are able to compile. Remember to run make clean, then:

./configure
sudo make install

and add the extension ini to /etc/php.d/
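That last ini step can be sketched as follows. The module filename wsf.so is an assumption on my part; confirm the actual name from your make install output before registering it:

```shell
# Hypothetical module name -- confirm it from the build output first
echo "extension=wsf.so" | sudo tee /etc/php.d/wsf.ini
sudo service httpd restart
# verify the extension actually loads
php -m | grep -i wsf
```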


Having trouble finding the sources? Try the GitHub repo here or the WSO2 site here. For some reason, trying to wget that last URL resulted in a 403 Forbidden for me, but I was able to download it using a browser.

How to modify GitWeb to include the Google Prettify (PrettyPrint) JS and CSS to colorize your git repo web display, because why not. This applies to GitWeb 1.8.3, though finding the proper lines in previous versions is not difficult.

To see the patch and the result in action, see the commit here.

HHVM, the HipHop Virtual Machine created by Facebook, is pretty cool, bringing C-like performance and scalability to the PHP crowd. PHP is certainly not my favorite language, but if you need something network-accessible deployed on a generic platform quickly, PHP is a good starting point.

There are a number of good resources on the net for this, but like all the posts here they are notes for myself and what worked for me.

Get the required packages for compilation:

sudo yum install git svn cpp make autoconf automake libtool patch memcached gcc-c++ cmake wget boost-devel mysql-devel pcre-devel gd-devel libxml2-devel expat-devel libicu-devel bzip2-devel oniguruma-devel openldap-devel readline-devel libc-client-devel libcap-devel binutils-devel pam-devel elfutils-libelf-devel

For CentOS there are a couple of libraries that are too old for HHVM, and we have to compile/install them first. Starting with libmcrypt, for which we need the development library as well:

cd ~/dev
wget ''
wget ''
rpm -Uhv libmcrypt-*.rpm

and GMP:

tar jxf gmp-5.1.3.tar.bz2 && cd gmp-5.1.3/
./configure --prefix=/usr/local/gmp
make && make install
cd ..

and mpfr:

tar jxf mpfr-3.1.2.tar.bz2 && cd mpfr-3.1.2/
./configure --prefix=/usr/local/mpfr --with-gmp=/usr/local/gmp
make && make install
cd ..

and mpc:

tar xzf mpc-1.0.1.tar.gz && cd mpc-1.0.1
./configure --prefix=/usr/local/mpc --with-mpfr=/usr/local/mpfr --with-gmp=/usr/local/gmp
make && make install
cd ..

make sure you have ncurses and ncurses-devel:

sudo yum install ncurses-devel

now build the cmake utility:

tar -xvzf cmake- 
cd cmake-
sudo make install

Google glog. For this we are going to need libcurl, which is not mentioned in the docs:

sudo yum install libcurl

now glog:

svn checkout google-glog
cd google-glog/
./configure --prefix=/usr
sudo make install

Now we need jemalloc:

tar -xvjf jemalloc-3.0.0.tar.bz2
cd jemalloc-3.0.0
./configure --prefix=/usr
sudo make install

libmemcached. I was already running a memcached server under a few services, so I had to uninstall and rebuild:

tar -xvzf libmemcached-1.0.17.tar.gz 
cd libmemcached-1.0.17
./configure --prefix=/usr
sudo service memcached stop
sudo yum remove libmemcached
sudo make install
cd ..

Now tbb. NOTE: CentOS 6.3 does have tbb available, but again it’s not recent enough, so avoid the temptation to yum it. I also had to make some adjustments to the install process, though this might be avoided with a more specific set of config flags:

wget ''
tar -xvzf tbb40_20120613oss_src.tgz 
cd tbb40_20120613oss
sudo mkdir -p /usr/include/serial
sudo cp -a include/serial/* /usr/include/serial/
sudo mkdir -p /usr/include/tbb
sudo cp -a include/tbb/* /usr/include/tbb/
sudo make install
sudo cp build/linux_intel64_gcc_cc4.7.3_libc2.12_kernel2.6.32_release/ /usr/lib64/
sudo ln -s /usr/lib64/ /usr/lib64/

libdwarf. I got lazy here, yum’d it, and it worked, so moving on:

sudo yum install libdwarf libdwarf-tools libdwarf-devel

Now boost. This one also took a while; make sure you have python and the related devel libs installed:

tar -xvjf boost_1_50_0.tar.bz2 
cd boost_1_50_0
./ --prefix=/usr --libdir=/usr/lib
sudo ./bjam --layout=system install

And finally, GCC. This one takes a while, so find something to do. I actually used the newest version of GCC, which requires a newer version of the cmake utility:

tar jxf gcc-4.8.2.tar.bz2 && cd gcc-4.8.2
./configure --prefix=/usr/local/gcc --enable-threads=posix --disable-checking --disable-multilib --enable-languages=c,c++ --with-gmp=/usr/local/gmp --with-mpfr=/usr/local/mpfr/ --with-mpc=/usr/local/mpc/

This configure line worked better for me on a seemingly identical CentOS image:

./configure --prefix=/usr --with-libdir=lib64 --with-gmp=/usr/lib64 --with-mpfr=/usr/lib64 --with-mpc=/usr/lib64

Now we are ready for HHVM. This is a large download, even through git:

git clone git://
cd hhvm
export CMAKE_PREFIX_PATH=`pwd`/../usr
export HPHP_HOME=`pwd`

Build HHVM:

git submodule init
git submodule update --recursive
export HPHP_HOME=`pwd`
cmake .
make

If this works, you’re home. Check your install with the version command:

hhvm --version

This is not a walk-through; this is a code dump, and much like a trash dump you are going to have to do a little digging to find anything useful within. So, say you have GitWeb running on your git revision server, and a remote repo somewhere else, and you’d like to be able to see from the GitWeb page which commit the remote repo is sitting at. This is how:

The first version of this code only handled one repo, this one can be branched into as many as you need.

First off, you need to set up an extremely underprivileged user on the host the remote repo sits on. Then you’ll need to install PHP with the ssh2 extension (which provides ssh2_connect) on the GitWeb host, along with SSH on both machines. Then you need to create a set of keys on the remote repo machine and import them to the GitWeb host to use for authentication.

The PHP query script:

// We are using a priv/pub key pair for this; they are protected with .htaccess
// !!!! The user for this MUST be highly underprivileged to avoid the risk of dangerous command execution. We do simple sanitation but the enemy is crafty. !!!!
// note: if you use the url instead of the ip in the connection setup, you will need to update the AllowUsers directive in the sshd config.
// We are restricting access for revuser to the IP of this server only.
class GetHead {

    // the no-privs user on the host of the remote repo
    protected $sshUser = 'revuser';
    // the IP of the remote repo host
    protected $hostIP = '';
    // the SSH port on the remote repo host
    protected $hostPort = 22;
    // the path to the local RSA public key
    protected $rsaPub = '/var/www/git/auth/';
    // the path to the local RSA private key
    protected $rsaPriv = '/var/www/git/auth/id_rsa';

    /**
     * Queries the remote host via SSH to determine the current HEAD revision of
     * each of the remote repos described within the incoming data.
     * @param string $data The JSON encoded data from the ajax request.
     * @return stdClass
     */
    public function getHead($data) {

        // our return object
        $rData = new stdClass();
        $rData->repos = new stdClass();

        // the incoming json data decoded
        $jData = json_decode($data);

        // assume we are going to succeed
        $rData->success = true;

        // configure the connection using the local keys
        $connection = ssh2_connect($this->hostIP, $this->hostPort, array('hostkey' => 'ssh-rsa'));

        // attempt to authenticate
        if (ssh2_auth_pubkey_file($connection, $this->sshUser, $this->rsaPub, $this->rsaPriv)) {

            // iterate through the remote repos and retrieve their HEAD
            foreach ($jData as $name => $local) {

                // the command we are going to exec
                $cmd = 'git --git-dir=' . $local->path . '.git --work-tree=' . $local->path . ' rev-parse HEAD';

                // capture the return stream; blocking is set to ensure that we have data before we try to read from the stream
                $stream = ssh2_exec($connection, $cmd);
                stream_set_blocking($stream, true);

                // $result is the current head revision
                $result = stream_get_contents($stream);

                // close the stream
                fclose($stream);

                $rData->repos->$name = new stdClass();

                // make sure we have something and it's not FALSE
                if (!empty($result)) {

                    // the return data
                    $rData->repos->$name->success = true;
                    $rData->repos->$name->head = str_replace(PHP_EOL, '', $result);
                } else {

                    // return the error message
                    $rData->repos->$name->success = false;
                    $rData->repos->$name->error = 'Error retrieving HEAD of remote repository.';
                }
            }
        } else {
            // fail
            $rData->success = false;
            $rData->error = 'SSH Authentication Failed, or missing PHP library.';
        }

        // close the connection
        ssh2_exec($connection, 'exit');

        // return the data
        return $rData;
    }
}

if ($_SERVER['REQUEST_METHOD'] === 'GET' && !empty($_GET['paths'])) {

    // the incoming data
    $data = filter_input(INPUT_GET, 'paths');

    // init the class and attempt to get some answers
    $head = new GetHead();

    // send the data back as a json obj
    header('Content-Type: application/json');
    echo json_encode($head->getHead($data));
} else {

    // they are not asking the right questions
    echo 'Go away Dave.';
}
// done

And the JavaScript. This script requires jQuery for the AJAX routine; its job is to query the PHP above to retrieve the current revision.

// update this object with the repos and their paths on production
var repoObject = {
    'repo.git' : {
        'repo1' : {
            'path'   : '/var/www/vhosts/remoteRepo1/',
            'branch' : 'master'
        },
        'repo2' : {
            'path'   : '/var/www/vhosts/remoteRepo2/',
            'branch' : 'master'
        }
    }
};

// this is our success function
function querySuccess(data){
    // if we have errors, display them
    if( 'error' in data ){
        $('#query_error').text(data.error);
        return;
    }
    // mark the display element of each remote HEAD we found
    for(var repo in data.repos){
        if(data.repos.hasOwnProperty(repo)) {
            var repoObj = data.repos[repo];
            $('.' + repoObj.head).addClass('currentHead ' + repo);
        }
    }
}

// we run this on document ready for great justice
$(document).ready(function(){
    var repoTitle = $('#project_name').val();
    // only check rev if we are within a recognized repo
    if ( repoTitle in repoObject ) {
        var rData = {
            'paths' : JSON.stringify(repoObject[repoTitle])
        };
        $.ajax({
            dataType: 'json',
            url: '/get_head.php',
            data: rData,
            success: querySuccess,
            error: function(){
                $('#query_error').text('There was an error connecting to the server or parsing the response.');
            }
        });
    }
});
Sometimes I fuck up, I know it’s hard to fathom but it happens. With women it’s usually permanent, they might be smiling and laugh at your jokes but the storm is brewing and she will have her revenge. Fortunately GIT is more willing to work with me in reconciling my mistakes. Hello GIT rebase.

There is enough documentation out there on rebase, so I’m just going to make a quick note here so that I can avoid swimming through Google results and the GIT manual.

Revert the commit:

git revert vd67abx2

Delete the last commit on branch master (this force-pushes its parent to the remote):

git push repoName +dc61lb62^:master

Delete a previous commit down to the last parent:

git rebase -i dc61bb12^

When the editor pops up, delete the lines of the commits you want to expunge from the repo. If you only want to edit a commit message, change ‘pick’ to ‘reword’ (or to ‘edit’ if you want to amend the commit itself).
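The revert flow above can be exercised safely in a throwaway repo before touching anything real. Everything here (repo name, identity, file) is made up for the demo:

```shell
# Throwaway demo of git revert: the bad commit stays in history,
# and a new commit is created that undoes it.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q demo
cd demo
git config user.email "demo@example.com"
git config user.name "Demo"
echo "good" > file.txt
git add file.txt
git commit -qm "good commit"
echo "oops" >> file.txt
git commit -aqm "bad commit"
# revert the bad commit (HEAD here; normally you'd pass its hash)
git revert --no-edit HEAD
cat file.txt   # prints "good" again
```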

The internet is a bastion of opinion and bullshit. SELinux is a rather difficult bride, and as such there is much disinformation regarding how to keep her happy. I have an Apache application that needs to update its own cache files, and the Fedora installation it was under had the files under the web root with a context of ‘httpd_sys_content_t’. This didn’t work, and many of the first suggestions I came across said to either disable SELinux or set some peculiar context like ‘public_content_rw_t’ and then alter an SELinux boolean to allow such access. We can have no such shenanigans around here; this is how it should be done:

sudo chcon -R -P -t httpd_sys_content_rw_t /var/www/path/to/dir/
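One caveat: chcon changes do not survive a filesystem relabel. The persistent version of the same fix records the context in the policy and then applies it (this assumes the semanage tool from policycoreutils is installed; on newer policies the type is spelled httpd_sys_rw_content_t):

```shell
# Record the context in the local policy, then relabel to apply it;
# unlike chcon, this survives restorecon runs and full relabels.
sudo semanage fcontext -a -t httpd_sys_content_rw_t "/var/www/path/to/dir(/.*)?"
sudo restorecon -Rv /var/www/path/to/dir/
```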

I did it, I destroyed the old site. The code, the database, everything. It was a cleansing experience that I’d liken to burning down an old house to start again.


I am Jack Brain and this is my internet home. This server is here for other purposes, it hosts a number of databases and web services that I use in work and play. These front pages are here for clarity and simple publication.