I’ve recently gotten back into high-end audio equipment with the purchase of a pair of Sennheiser HD 800 headphones. I’ve had a quality set of home shelf speakers for many years now, but they aren’t something I can take to work, or anywhere else for that matter. For anyone who’s still with me on this, you know the difference a pair of high-end speakers or headphones can make in the sound and clarity of music compared to even expensive earbuds or car stereo speakers. There really is no comparison.

The downside to this is that any of your music that has been compressed into MP3, M4A, etc. sounds thin, hollow and lifeless, even if it was recorded well to begin with. This has inspired me to begin rebuilding much of my music library that isn’t already in FLAC, and along the way I thought I’d run a spectrum analysis of a song to document various levels of compression and the data loss they introduce.
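If you’d like to run a similar analysis yourself, one easy way (just one of many, and the file names here are placeholders) is SoX’s spectrogram effect:

$sox song.flac -n spectrogram -o song-spectrum.png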

Audio Spectrum

The top graph represents the FLAC 44.1kHz sample, the middle represents 256kbps MP3, and the bottom represents ~174kbps VBR MP3. The frequency cutoff, banding and loss of definition are obvious just from looking at the graphs, and even more so upon listening if you know the song well. It’s true you won’t notice much of a difference without high quality audio components, but given today’s availability of cheap storage there is no good reason not to store your catalog in lossless, or at least low-loss, formats (256kbps or higher). Need to fit a lot of music onto a smaller mobile listening device? Encode the music from the lossless sources when you need to, keeping the original as a backup (example).
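As an example of that workflow, a lossless master can be transcoded down for a portable player with something like ffmpeg; the bitrate and file names here are illustrative:

$ffmpeg -i song.flac -codec:a libmp3lame -b:a 256k song.mp3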

The errors encountered during configure:
checking for sysvipc shared memory support... no
checking for mmap() using MAP_ANON shared memory support... no
checking for mmap() using /dev/zero shared memory support... no
checking for mmap() using shm_open() shared memory support... no
checking for mmap() using regular file shared memory support... no
checking "whether flock struct is linux ordered"... "no"
checking "whether flock struct is BSD ordered"... "no"
configure: error: Don't know how to define struct flock on this system, set --enable-opcache=no

I encountered these on a local system recently, a system where PHP 5.5.12 had successfully compiled WITH opcache several weeks prior. One of the major performance advantages of the 5.5 generation is the opcache extension, so disabling it was not an option. Long story short, flock (file lock) structuring is passed to make by libltdl, which is part of the GNU libtool family, and this info is required by the opcache extension to establish part of its memory mapping strategy during compilation. How these became lost or corrupted since the last install is a mystery, but I’m not Angela Lansbury and I need this to work, because there is a particular LDAP-related bugfix that may impact us. So this is what needs to be done:

$sudo yum reinstall libtool libtool-ltdl libtool-ltdl-devel

This is a peculiar series of errors, but I’m certain I’m not the first to encounter it. If you find yourself in need of the Authen::Libwrap Perl library and have tried the usual install mechanisms, YUM and CPAN, without success, then keep reading. The problem is that you are missing a few packages not directly listed in the prerequisites, namely tcp_wrappers, tcp_wrappers-devel, and the Perl modules local::lib and Test::Exception. The Test module is used during the manual build of the Authen::Libwrap module. So…

$sudo yum install tcp_wrappers tcp_wrappers-devel
$sudo perl -MCPAN -e 'install Test::Exception'
$sudo perl -MCPAN -e 'install local::lib'

Then you’ll need to manually build the Authen::Libwrap libraries if your repo doesn’t have them:
$wget http://www.cpan.org/authors/id/D/DM/DMUEY/Authen-Libwrap-0.22.tar.gz
$tar -xzf Authen-Libwrap-0.22.tar.gz
$cd Authen-Libwrap-0.22 && perl Makefile.PL && make
$make test
$sudo make install
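
Assuming the tests pass, a quick one-liner will confirm Perl can actually load the module:

$perl -MAuthen::Libwrap -e 'print "Authen::Libwrap loaded\n"'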

If you start seeing messages like
validating @0xb4a348a98: choices-st.truste.com AAAA: no valid signature found
validating @0xb4224288: mozilla.com SOA: no valid signature found
validating @0xb42f74910: choices-st.truste.com AAAA: no valid signature found

in your syslog, then check your BIND config. On Red Hat systems it’s located at /etc/named.conf, and if DNSSEC is enabled (as it should be) it will contain a set of configuration options that read:
dnssec-enable yes;
dnssec-validation yes;
dnssec-lookaside auto;

The ambiguity here resides in the config line dnssec-validation yes;, which instructs named to validate signatures but, without further direction, does not provide a set of root keys to compare against, which results in named being unable to validate the signatures.

To correct this, change the ‘yes’ option to ‘auto’, which will instruct named to use the set of compiled-in root keys it ships with. Your DNSSEC configuration should look something like this:
dnssec-enable yes;
dnssec-validation auto;
dnssec-lookaside auto;

Restart BIND/named and move on.
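
On Red Hat family systems that’s typically the following (swap in systemctl restart named on systemd-based releases), and you can spot-check validation with dig; a validated answer will carry the ad flag:

$sudo service named restart
$dig +dnssec mozilla.com @127.0.0.1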

The WSO2 framework provides comprehensive WSS/WSI security for SOAP and REST-based web services, with bindings in multiple languages including Java, PHP, Python, C, Ruby and many more. Unfortunately, if you are attempting to compile this library extension for PHP > 5.3, you are going to have a bad time.

The first error you will run into is zend_class_entry has no member named default_properties.

The second error, once you find a way around that one, is struct _php_core_globals has no member named safe_mode. These are both due to changes made in PHP 5.4; Safe Mode in particular was deprecated in 5.3 and removed in 5.4, see PHP Safe Mode for more details.

The third error you may encounter is along the lines of error: ‘CHECKUID_CHECK_FILE_AND_DIR’ undeclared, which is also due to deprecated/retired components of PHP.

Fortunately the fixes are few and easy; here are the patches:

src/wsf.c:

@@ -458,8 +458,12 @@
 
     ALLOC_HASHTABLE(intern->std.properties);
     zend_hash_init(intern->std.properties, 0, NULL, ZVAL_PTR_DTOR, 0);
+#if PHP_VERSION_ID < 50399
     zend_hash_copy(intern->std.properties, &class_type->default_properties,
             (copy_ctor_func_t) zval_add_ref, (void *) & tmp, sizeof (void *));
+#else
+    object_properties_init((zend_object*) &(intern->std.properties), class_type);
+#endif
 
     retval.handle = zend_objects_store_put(intern,
             (zend_objects_store_dtor_t) zend_objects_destroy_object,

src/wsf_util.c:

@@ -1986,10 +1986,6 @@
 
 	if (VCWD_REALPATH(path, resolved_path_buff)) 
 	{
-		if (PG(safe_mode) && (!php_checkuid(resolved_path_buff, NULL, CHECKUID_CHECK_FILE_AND_DIR))) 
-		{
-			return NULL;
-		}
 
 		if (php_check_open_basedir(resolved_path_buff TSRMLS_CC)) 
 		{

You’ll notice that in wsf_util.c we simply removed that particular check, because both the function and the constant involved no longer exist; there may be a better solution to this, but for the moment we are able to compile. Remember to make clean, then:
./configure
make
sudo make install
and add the extension ini to /etc/php.d/.
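
A minimal ini for that last step is just a one-liner; the file name and extension name below assume the module was built as wsf.so, so adjust to whatever your build actually produced.

WSF extension ini (/etc/php.d/wsf.ini):

extension=wsf.so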

Done.

Having trouble finding the sources? Try the GitHub repo here or the WSO2 site here. For some reason trying to wget that last URL resulted in a 403 Forbidden for me, but I was able to download it using a browser.

How to modify GitWeb to include the Google Prettify (PrettyPrint) JS and CSS to colorize your git repo web display, because why not. This applies to GitWeb 1.8.3, though finding the proper lines in previous versions is not difficult.

To see the patch and the result in action, see the commit here.

This is a simple Java app that uses the PDFBox library to locate text within a PDF document. It is designed to be run from the command line, originally invoked by a Python script. Given a PDF, it will parse the entire document and produce a comma-delimited string of each identified word followed by the page number in parentheses and the x/y coordinates of the top-left corner of the first letter in brackets. Words are identified by character groupings and simple space and punctuation placement. For the most part all of the magic here is performed by PDFBox, which is a fantastic library for parsing PDFs. This is a rough and rather featureless version of the one I used in production and could certainly use some improvement, though it’s a good place to start if you can’t find a working example.

Output will be similar to:
[(1)[190.3 : 286.8] WORD1, (1)[283.3 : 286.8] WORD2, ...]

Dependencies: Apache PDFBox 1.x (the code below uses the 1.x org.apache.pdfbox.util API).

package printtextlocations;

import java.io.File;
import java.io.IOException;
import java.text.DecimalFormat;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

import org.apache.pdfbox.exceptions.InvalidPasswordException;
import org.apache.pdfbox.pdmodel.PDDocument;
import org.apache.pdfbox.pdmodel.PDPage;
import org.apache.pdfbox.pdmodel.common.PDStream;
import org.apache.pdfbox.util.PDFTextStripper;
import org.apache.pdfbox.util.TextPosition;

public class PrintTextLocations extends PDFTextStripper {

    public static StringBuilder tWord = new StringBuilder();
    public static String seek;
    public static String[] seekA;
    public static List wordList = new ArrayList();
    public static boolean is1stChar = true;
    public static boolean lineMatch;
    public static int pageNo = 1;
    public static double lastYVal;

    public PrintTextLocations()
            throws IOException {
        super.setSortByPosition(true);
    }

    public static void main(String[] args)
            throws Exception {
        PDDocument document = null;
        seekA = args[1].split(",");
        seek = args[1];
        try {
            File input = new File(args[0]);
            document = PDDocument.load(input);
            if (document.isEncrypted()) {
                try {
                    document.decrypt("");
                } catch (InvalidPasswordException e) {
                    System.err.println("Error: Document is encrypted with a password.");
                    System.exit(1);
                }
            }
            PrintTextLocations printer = new PrintTextLocations();
            List allPages = document.getDocumentCatalog().getAllPages();

            for (int i = 0; i < allPages.size(); i++) {
                PDPage page = (PDPage) allPages.get(i);
                PDStream contents = page.getContents();

                if (contents != null) {
                    printer.processStream(page, page.findResources(), page.getContents().getStream());
                }
                pageNo += 1;
            }
        } finally {
            if (document != null) {
                System.out.println(wordList);
                document.close();
            }
        }
    }

    @Override
    protected void processTextPosition(TextPosition text) {
        String tChar = text.getCharacter();
        System.out.println("String[" + text.getXDirAdj() + ","
                + text.getYDirAdj() + " fs=" + text.getFontSize() + " xscale="
                + text.getXScale() + " height=" + text.getHeightDir() + " space="
                + text.getWidthOfSpace() + " width="
                + text.getWidthDirAdj() + "]" + text.getCharacter());
        String REGEX = "[,.\\[\\](:;!?)/]";
        char c = tChar.charAt(0);
        lineMatch = matchCharLine(text);
        if ((!tChar.matches(REGEX)) && (!Character.isWhitespace(c))) {
            if ((!is1stChar) && (lineMatch == true)) {
                appendChar(tChar);
            } else if (is1stChar == true) {
                setWordCoord(text, tChar);
            }
        } else {
            endWord();
        }
    }

    protected void appendChar(String tChar) {
        tWord.append(tChar);
        is1stChar = false;
    }

    protected void setWordCoord(TextPosition text, String tChar) {
        tWord.append("(").append(pageNo).append(")[").append(roundVal(Float.valueOf(text.getXDirAdj()))).append(" : ").append(roundVal(Float.valueOf(text.getYDirAdj()))).append("] ").append(tChar);
        is1stChar = false;
    }

    protected void endWord() {
        String newWord = tWord.toString().replaceAll("[^\\x00-\\x7F]", "");
        String sWord = newWord.substring(newWord.lastIndexOf(' ') + 1);
        if (!"".equals(sWord)) {
            if (Arrays.asList(seekA).contains(sWord)) {
                wordList.add(newWord);
            } else if ("SHOWMETHEMONEY".equals(seek)) {
                wordList.add(newWord);
            }
        }
        tWord.delete(0, tWord.length());
        is1stChar = true;
    }

    protected boolean matchCharLine(TextPosition text) {
        Double yVal = roundVal(Float.valueOf(text.getYDirAdj()));
        if (yVal.doubleValue() == lastYVal) {
            return true;
        }
        lastYVal = yVal.doubleValue();
        endWord();
        return false;
    }

    protected Double roundVal(Float yVal) {
        DecimalFormat rounded = new DecimalFormat("0.0'0'");
        Double yValDub = new Double(rounded.format(yVal));
        return yValDub;
    }
}
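
For reference, compiling and running it from the command line looks something like this; the PDFBox jar name, package directory and PDF path are placeholders for whatever you have locally (the pdfbox-app jar bundles the FontBox/JempBox dependencies):

$javac -cp pdfbox-app-1.8.2.jar printtextlocations/PrintTextLocations.java
$java -cp .:pdfbox-app-1.8.2.jar printtextlocations.PrintTextLocations document.pdf WORD1,WORD2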

WordPress and security are not the best of friends, but if you’re going to be dragged over the coals by Ivan you might as well make him work for it. Fail2Ban is a great little service that helps stall brute force attempts against SSH and similar auth methods; it can also be used to monitor and block persistent failed authentications against WordPress and Webmin. Since WordPress does not log failed login attempts on its own, a simple plugin is required to provide fail2ban the proper notifications; that plugin is called “WP fail2ban” and can be found here. You will need to make a few configuration changes to fail2ban to get things working. These are the configurations that worked for me on Fedora:

WordPress jail.local (/etc/fail2ban/jail.local):

[wordpress]
enabled  = true
filter   = wordpress
logpath  = /var/log/messages
maxretry = 5
action   = iptables-multiport[name=wordpress, port="http,https", protocol=tcp]
           sendmail-whois[name=Wordpress, dest=root, sender=fail2ban@jackson-brain.com, sendername="The WordPress Bouncer"]

WordPress filter (/etc/fail2ban/filter.d/wordpress.conf):

_daemon = wordpress

# Option:  failregex
# Notes.:  regex to match the password failures messages in the logfile. The
#          host must be matched by a group named "host". The tag "<HOST>" can
#          be used for standard IP/hostname matching and is only an alias for
#          (?:::f{4,6}:)?(?P<host>[\w\-.^_]+)
# Values:  TEXT
#
failregex = ^%(__prefix_line)sAuthentication failure for .* from <HOST>$
            ^%(__prefix_line)sBlocked authentication attempt for .* from <HOST>$
            ^%(__prefix_line)sBlocked user enumeration attempt from <HOST>$

# Option:  ignoreregex
# Notes.:  regex to ignore. If this regex matches, the line is ignored.
# Values:  TEXT
#
ignoreregex =

For Webmin, all I needed to do was update the [webmin-auth] section to properly reflect the location of failed webmin login attempts:

[webmin-auth]

enabled = true
filter  = webmin-auth
action  = iptables-multiport[name=webmin,port="10007"]
logpath = /var/log/secure
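
After saving the jail and filter changes, restart fail2ban and make sure the new jails came up; on systemd-based Fedora releases swap service for systemctl:

$sudo service fail2ban restart
$sudo fail2ban-client status wordpress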

Webmin makes certain things easy when managing remote Unix/Linux servers; some things it makes more difficult, if only because its modules don’t get updated very often. Shorewall makes managing large iptables rule sets easy, but its Webmin interface is outdated. For instance, the Blacklist section in the Shorewall Webmin module directs to ‘/etc/shorewall/blacklist’, which according to the Shorewall documentation: ‘The blacklist file is used to perform static blacklisting by source address (IP or MAC), or by application. The use of this file is deprecated and beginning with Shorewall 4.5.7, the file is no longer installed.’

The Shorewall Webmin module still directs the user to this file for modification, and because of this, changes have no effect. The file you should be looking at is ‘/etc/shorewall/blrules’, as documented here.
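
As a quick illustration of the blrules format (the network below is just a documentation-range placeholder), a static drop of a noisy subnet looks like this:

#ACTION     SOURCE                  DEST
DROP        net:203.0.113.0/24      all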

When attempting to compile PHP on CentOS 6.x, you might run into compile errors such as:
pdo/php_pdo.h: No such file or directory
and
pdo/php_pdo_driver.h: No such file or directory

These files do exist, just not in the location where the configure script looks for them. There are two ways to fix this: the first is to modify the configure script to look in the proper place, and the second is to create two symbolic links to the wayward files. I chose the second method.

The files are in ext/pdo/, but the configure script looks in pdo/, so we want to make the pdo directory and create the links within it:

make clean
mkdir pdo
ln -s ../ext/pdo/php_pdo.h pdo/php_pdo.h
ln -s ../ext/pdo/php_pdo_driver.h pdo/php_pdo_driver.h

OR, more simply…

ln -s ./ext/pdo

Now re-configure and compile. Done.