npm and Yeoman update/install notes for macOS

Had to deal with some issues introduced by bad Homebrew installations and a possibly old npm install. My guess is that I cheated/screwed up by using a sudo install when I should have done something different.

This all started while playing with the WebDevStudios WordPress plugin generator for Yeoman.

Get a Plugin Kickstart with Yeoman & generator-plugin-wp!

This gist was very helpful – Fixing npm On Mac OS X for Homebrew Users

Another strange issue was caused by installing the generator-plugin-wp Yeoman generator on a broken Yeoman/npm system. This GitHub issue helped: https://github.com/npm/npm/issues/10995. The fix was a straightforward uninstall, cache clear, and reinstall. I may have reinitialized my terminal sessions to drop out of sudo mode. It looked something like:

$ sudo npm remove -g yo generator-plugin-wp
$ npm cache clean
# can't remember if I relaunched the terminal to drop out of sudo mode
$ npm install -g generator-plugin-wp
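
If I had to do this again I'd sanity check the setup before running the generator. A quick sketch, assuming the gist's cleanup also reinstalled yo itself (if not, an npm install -g yo goes first):

$ npm config get prefix
$ which yo
$ yo --version
$ yo plugin-wp

The prefix check matters since the whole mess came from a sudo install; it should point somewhere your user can write to, so globals install without sudo.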

I’ll be testing this Yeoman generator on the side for a while: https://github.com/WebDevStudios/generator-plugin-wp

Possum Wasp

So last night I attempted to fight off an urban possum with a can of wasp spray. It was all I had, and the possum didn't exactly run off; instead it walked away looking sort of angry. Now, if my poor choice of weaponry caused some angry mutation in downtown Houma, you have my apologies.


Spying on a directory with auditd

Files start coming up missing for me on a server and I get freaked out looking for security holes, but sometimes it's just users and other utilities spiking the punch bowl. You can get serious about watching files with fancier utilities, but I went back to good ole auditd.

A simple test to track stuff getting trashed from an upload folder:

auditctl -w /site-dir/wp-content/uploads/ -p wa -k upload_issue
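
You can double-check that the watch actually loaded by listing the active rules (the output format varies a bit between auditd versions):

auditctl -l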

A capital W will remove the rule:

auditctl -W /site-dir/wp-content/uploads/ -p wa -k upload_issue

Do a quick search for issues with ausearch.

ausearch -f wp-content/uploads
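
Since the watch was tagged with -k upload_issue, searching by that key works too, and -i tells ausearch to interpret numeric UIDs and syscall numbers into readable names:

ausearch -k upload_issue -i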

Now permanently add the rule on a Red Hat system by putting this line in /etc/audit/audit.rules. Just leave off the auditctl command:

 -w /site-dir/wp-content/uploads/ -p wa -k upload_issue

Of course you need to make sure your auditd process is running and enabled at boot with chkconfig, etc. Good ole status check like:

/etc/init.d/auditd status
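
On the same old-school Red Hat init setup, turning it on at boot and rereading the rules file looks something like this (a restart reloads /etc/audit/audit.rules):

chkconfig auditd on
service auditd restart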

Here are a few of the resources I used:

Please forgive the Red Hat auth-walls…

https://access.redhat.com/documentation/en-US/Red_Hat_Enterprise_Linux/7/html/Security_Guide/sec-Defining_Audit_Rules_and_Controls.html
http://stackoverflow.com/questions/29519590/monitor-audit-file-delete-on-linux
http://www.cyberciti.biz/tips/linux-audit-files-to-see-who-made-changes-to-a-file.html

My Favorite HTTrack commands

HTTrack is a website mirroring utility that can swamp your disks with mirror copies of the internet. I’ve had to use it several times to make off-line copies of websites for all sorts of weird reasons. You’ll find HTTrack at: www.httrack.com. You can get a full list of command line options at: https://www.httrack.com/html/fcguide.html. There is a spiffy web and Windows wizard interface for HTTrack, but I gave that up.

This is the recipe for the command line options I’ve been using to produce a browsable offline version of accreditation documents. This command says: “Make an offline mirror of these URLs, go up to 8 links deep on these sites and 2 links deep on other domains. Stay on the TLD (.edu) and do it as quickly as possible.” Be warned: as it currently stands, this will fill up about 1.5GB of disk space ;P

httrack http://www.nicholls.edu/sacscoc-2016/ http://www.nicholls.edu/catalog/2014-2015/html/ http://www.nicholls.edu/about/ -O /Users/nichweb/web-test -r8 -%e1 -%c16 -c16 -B -l -%P -A200000

The great part is that the archive grows as URLs are added.
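
To refresh the archive later, HTTrack can update a mirror in place. My understanding is that running it from inside the project directory with --update reuses the cached options from the first run; I'd still test this on a throwaway mirror first:

cd /Users/nichweb/web-test
httrack --update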

Apache log one-liners using tail, awk, sort, etc.

A good bunch of samples, with more examples, can be found at: https://blog.nexcess.net/2011/01/21/one-liners-for-apache-log-files/

# top 20 URLs from the last 5000 hits
tail -5000 ./transfer.log | awk '{print $7}' | sort | uniq -c | sort -rn | head -20
tail -5000 ./transfer.log | awk '{freq[$7]++} END {for (x in freq) {print freq[x], x}}' | sort -rn | head -20
 
# top 20 URLs excluding POST data from the last 5000 hits
tail -5000 ./transfer.log | awk -F"[ ?]" '{print $7}' | sort | uniq -c | sort -rn | head -20
tail -5000 ./transfer.log | awk -F"[ ?]" '{freq[$7]++} END {for (x in freq) {print freq[x], x}}' | sort -rn | head -20
 
# top 20 IPs from the last 5000 hits
tail -5000 ./transfer.log | awk '{print $1}' | sort | uniq -c | sort -rn | head -20
tail -5000 ./transfer.log | awk '{freq[$1]++} END {for (x in freq) {print freq[x], x}}' | sort -rn | head -20
 
# top 20 URLs requested from a certain ip from the last 5000 hits
IP=1.2.3.4; tail -5000 ./transfer.log | grep $IP | awk '{print $7}' | sort | uniq -c | sort -rn | head -20
IP=1.2.3.4; tail -5000 ./transfer.log | awk -v ip=$IP ' $1 ~ ip {freq[$7]++} END {for (x in freq) {print freq[x], x}}' | sort -rn | head -20
 
# top 20 URLs requested from a certain ip, excluding POST data, from the last 5000 hits
IP=1.2.3.4; tail -5000 ./transfer.log | fgrep $IP | awk -F "[ ?]" '{print $7}' | sort | uniq -c | sort -rn | head -20
IP=1.2.3.4; tail -5000 ./transfer.log | awk -F"[ ?]" -v ip=$IP ' $1 ~ ip {freq[$7]++} END {for (x in freq) {print freq[x], x}}' | sort -rn | head -20
 
# top 20 referrers from the last 5000 hits
tail -5000 ./transfer.log | awk '{print $11}' | tr -d '"' | sort | uniq -c | sort -rn | head -20
tail -5000 ./transfer.log | awk '{freq[$11]++} END {for (x in freq) {print freq[x], x}}' | tr -d '"' | sort -rn | head -20
 
# top 20 user agents from the last 5000 hits
tail -5000 ./transfer.log | cut -d' ' -f12- | sort | uniq -c | sort -rn | head -20
 
# sum of data (in MB) transferred in the last 5000 hits
tail -5000 ./transfer.log | awk '{sum+=$10} END {print sum/1048576}'
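
All of these share the same tail | awk | sort | uniq -c | sort -rn shape, so when I'm digging repeatedly I'd wrap it in a tiny shell function. logtop is my own hypothetical name, and it assumes the same combined log layout as above:

# logtop LOGFILE FIELD [N] - top N (default 20) values of an awk field from the last 5000 hits
logtop() {
    tail -5000 "$1" | awk -v f="$2" '{print $f}' | sort | uniq -c | sort -rn | head -n "${3:-20}"
}
logtop ./transfer.log 7       # top 20 URLs
logtop ./transfer.log 1 10    # top 10 client IPs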

Using HyperDB to separate and share users and usermeta between WordPress installations

I need to remember to keep this example for some testing. It should be a good start for sharing the users and usermeta tables between websites, though I do know that user meta tends to hold very site-centric settings at times. The original article was located at: http://wordpress.aspcode.net/view/63538464303732726666099/how-to-use-hyperdb-to-separate-and-share-a-user-dataset-between-wordpress-installs

// This goes in HyperDB's db-config.php, alongside the db.php drop-in in wp-content.
$wpdb->add_database(array( // Connect to the shared users database
    'host'     => DB_HOST, // I am using the same host for my two DBs
    'user'     => DB_USER, // I am using the same username for my two DBs
    'password' => DB_PASSWORD, // I am using the same p/w for my two DBs
    'name'     => 'my_user_db_name', 
    'write'    => 0, // Change to 1 if you want the secondary sites to have the power to update user data.
    'read'     => 1,
    'dataset'  => 'user',
    'timeout'  => 0.2,
));

$wpdb->add_database(array( // Main Database
    'host'     => DB_HOST,
    'user'     => DB_USER,
    'password' => DB_PASSWORD,
    'name'     => DB_NAME,
));

$wpdb->add_callback('user_callback');
function user_callback($query, $wpdb) {
    // WordPress names the core tables users and usermeta (not user_meta).
    if ( $wpdb->base_prefix . 'users' == $wpdb->table || $wpdb->base_prefix . 'usermeta' == $wpdb->table ) {
        return 'user'; 
    }
}

Create a new Git repo from an old repo

How to extend an old repository as a full copy in a new repository. This preserves the history of the old repository. Future changes will not affect the old repository; they will be committed to the new one.

This originally came from the info found at: http://stackoverflow.com/questions/10963878/how-do-you-fork-your-own-project-on-github

# Clone the old repo into a new directory; this becomes the new repo's working copy.
$ git clone https://github.com/nicholls-state-university/nicholls-2012-core.git nicholls-2015-core
# Change directory to the new repo.
$ cd nicholls-2015-core
# Point origin at the new repo. Remember to create the empty repo on GitHub first.
$ git remote set-url origin https://github.com/nicholls-state-university/nicholls-2015-core.git
# Push commits to the new area.
$ git push origin master
# Push all branches to the repo, just making sure.
$ git push --all
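
GitHub documents an equivalent one-shot version of this using a bare clone; a sketch, assuming the empty nicholls-2015-core repo already exists on GitHub:

# Make a temporary bare clone of the old repo.
$ git clone --bare https://github.com/nicholls-state-university/nicholls-2012-core.git
$ cd nicholls-2012-core.git
# --mirror pushes every branch and tag to the new repo.
$ git push --mirror https://github.com/nicholls-state-university/nicholls-2015-core.git
# The bare clone is disposable once the push finishes.
$ cd .. && rm -rf nicholls-2012-core.git

One difference from the recipe above: git push --all moves branches but not tags, so a git push --tags is needed if the old repo has any.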