Archive for Server Admin

Keep FTP in sync with a Git repo

I was tasked with making some changes to a site today over FTP.  It seems odd that people are still OK with letting developers push and pull files over FTP without so much as a change log or automated linting, testing, etc.  Anywho, I tried to find my cowboy hat, but it snowed yesterday so all my summer gear is put away, and since it's moderately inappropriate to do cowboy things while looking like a snowboarder, I had to come up with a better way to make working on files over FTP less Wild West and more Gnar Gnar.

TL;DR

  1. Set up a cron script that maintains a local mirror of the remote FTP host using lftp and automatically commits the changes to a hosted Git repo (sketched below).
  2. Set up a project in Jenkins to monitor the Git repo for changes:
    1. Ignore the commits created by the cron mirror script
    2. Lint the project
    3. Execute a reverse-mirror lftp script that pushes the local changes to the remote FTP host, deleting files that are no longer relevant.
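
A minimal sketch of the cron-side mirror script, assuming a local working copy at /var/mirrors/example.com, a Git remote named origin, and placeholder FTP credentials – the shape of the thing rather than a drop-in:

#!/bin/bash
# Mirror the remote FTP host into a local Git working copy, then commit any drift.
# Host, credentials, and paths are placeholders - adjust for your environment.
HOST=ftp.example.com
USER=deploy
PASS=secret
LOCAL=/var/mirrors/example.com

# --exclude-glob keeps the local .git directory out of the delete pass.
lftp -u "$USER,$PASS" -e "mirror --delete --only-newer --exclude-glob .git* / $LOCAL; quit" "$HOST"

cd "$LOCAL" || exit 1
git add -A
# git commit exits non-zero when nothing changed, so the push only fires on real drift.
git commit -m "FTP mirror sync $(date +%F-%H%M)" && git push origin master

The Jenkins job runs essentially the same lftp call with mirror -R --delete (again excluding .git) to push local changes back to the FTP host, and it can be configured to ignore commits whose message matches "FTP mirror sync" so the mirror commits don't retrigger a deploy.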

Read more

Make a Belkin F5D7230-4 v1010 Useful with DD-WRT

While moving to Greeley my Netgear WNDR3400v2 fell apart.  Being lazy, I had super-glued it to the inside leg of a metal desk, and removing it with force didn't work out so well, so I borrowed an old Belkin F5D7230-4 from my brother-in-law only to realize the firmware on it was horrid.  It only supported WEP encryption and you couldn't even turn the wireless off.  It was a mess, and rather than waste time attempting to update the proprietary firmware I flashed it with DD-WRT.  It took less than 5 minutes to get rolling with DD-WRT Micro, and it was super simple to install from my Mac.  Here's how I got it done:

Read more

New DD-WRT Firmware for Netgear WNDR3400v2 w/Heartbleed fix

This is an update to my previous post, where I installed DD-WRT on a Netgear WNDR3400v2.  There's a new DD-WRT firmware for the Netgear WNDR3400v2 available over at MyOpenRouter.com.  I haven't flashed my router with it yet, but a friend, Dave Compton, has, and he says it's smooth sailing.  Below you'll find a few e-mails we shared back and forth on the topic.  The bottom line here is that if you're considering using DD-WRT on your WNDR3400v2, you should use the firmware linked above (and below in the email message) instead of the one from my original post.


From: Dave
Sent: ‎7/‎13/‎2014 9:54 PM
To: Josh Houghtelin
Subject: Re: DD-WRT on a Netgear WNDR3400v2

I’ve been using the new version for a week or so without any problems.   The major benefit for me is just that I’m running a more up-to-date version which I assume ( hope ) has bug fixes that the earlier version did not.

Aside from that, the only benefit that I see is that the default iptables includes a rule that implements “NAT loopback”.  Previously, I had added a similar rule by hand to accomplish the same thing.  In fact, the ability to do that was my primary reason for installing dd-wrt in the first place.  Given that I already knew how to do this, it’s not too important to me now.  However, if the previous version had had this rule built in, it would have saved me some trouble – and cost me a learning experience.

– Dave

On Thu, Jul 3, 2014 at 1:15 PM, Dave wrote:

It lists these new features:

——————————————————————————-
Includes Heartbleed Protection
– Router rename
– Duplicate MAC address of network interfaces of two same router
– Mount multiple partitions with different filesystems

New features:
– rflow
– ntfs-3g
– epi_ttcp
– NAT loopback
– inotify
– OpenVPN

——————————————————————————-

So far I've been using the earlier version since January with no problems.  Every now and then I wonder if using a version of dd-wrt that was created by "some guy on the internet" more than a year ago might not be a good idea though. Today I looked and saw the update.

http://www.myopenrouter.com/download/44198/DD-WRT-for-NETGEAR-WNDR3400v2-With-Heartbleed-Protection/

I’d feel better about the binaries if I downloaded the source and built it myself but from what I’ve read that looks like it might be tricky.   But at least this version is much more up-to-date.

I plan to try it out over the next week or so.  If I do, I’ll let you know how things work out.  If you get there first, please let me know.

Thanks.
– Dave

On Thu, Jul 3, 2014 at 12:56 PM, Josh Houghtelin <josh@findsomehelp.com> wrote:

Nope.  I haven't.  My WNDR3400v2 hasn't faltered in its operation in the slightest, so I haven't bothered with updates.  Does the new version provide any worthwhile enhancements?  If so, I'll give it a go.

The only feature I really desired out of the box was some real NFS support instead of that crap FTP resource they provide to allow access to the USB data store.

 Josh Houghtelin

(719) 422-5010

On Thu, Jul 3, 2014 at 1:50 PM, Dave wrote:

Hi Josh,

I just noticed that there is a new version of DD-WRT available for the  WNDR3400v2.  Have you tried it out yet?  I’ll probably try it myself but I wanted to ask you first.

– Dave

 

Submitting Personal Information with[out] SSL

UPDATE: As of September 24, 2013, TeachingChile.com is completely wrapped in SSL. ~Thank you!

This post is no longer entirely relevant. TeachingChile.com has updated their site and wrapped it in SSL. Thanks, guys!

https://teachingchile.com

 


URL: http://teachingchile.com/apply_online/machform/view.php?id=6

TeachingChile.com has an online application process that requires the submission of quite a bit of personal information, including your passport details, over plain text. Seriously? I almost feel like someone's playing a prank here.  With SSL and 'secure websites' being pretty well understood, it's mind-boggling to see websites like this still exist, requesting that personal information including a passport number be submitted via a plain-text, non-secured form. What's more amazing is that the lack of SSL is just one item in what seems to be a whole ton of security ignorance, which pretty much guarantees anyone submitting data to these guys gets their identity (all their submitted data) jacked.

Wha? Google shows some love.

What makes it more astonishing is that the domain was registered in 2005, ranks #1 for "Teach in Chile" on Google (US), and ranks really well for quite a few other keyword phrases. With potential traffic exceeding thousands of visitors a month, I wonder how many fill out that insecure form?

Source: http://www.semrush.com/info/teachingchile.com

Google gives the page that links to the non-SSL-encrypted form a PageRank of 3:
http://teachingchile.com/to_apply.htm – With all the crazy search listing algorithms and such, you'd think Google wouldn't demonstrate much appreciation for this.

WTF? Guess until you find a stored form!

Saved forms can easily be brute-forced!  Now this is so far over the top I'm not sure what to make of it, but the site offers to let you save the form if you provide an e-mail address, like so:

Save Form & Resume Later

and upon saving you are presented with a 'special link' which, at a quick glance, ends in what looks like a simple 10-character alphanumeric hash.

Link to permanently saved form.

http://www.teachingchile.com/apply_online/machform/view.php?id=6&mf_resume=f4e2cdde3a

As far as I can tell that link is permanent, unless they purge the system of old resumable forms at some point. But to drive my point home – all one has to do is generate hash values 10 alphanumeric characters long. The following function 'should' generate those hashes.  I can't say I've checked, but it really is that simple.

function generate_random_hash($length = 10) {
 $chars = '0123456789abcdefghijklmnopqrstuvwxyz'; // Our hash-building alphanumeric soup
 $hash = ''; // Start empty so we aren't appending to an undefined variable
 $char_count = 1; // Counter for how many characters our hash is (as it loops and grows)
 while($char_count <= $length){
   // Add random chars from our alphanumeric soup until we hit our target length
   $hash .= substr($chars, rand(0, 35), 1);
   $char_count++;
 }
 return $hash;
}

Replace the 'special' part of the resume URL with a generated hash and test for live data.

There can't be more than 36^10 – roughly 3.7 quadrillion – possible hashes uniquely identifying the saved forms in their database.  A few quadrillion is a lot, don't get me wrong, but it's really not: break the work down across a thousand, ten thousand, or even more computers and it becomes pretty easy to pull the task off in a short period of time, even if approached slowly enough not to bring their web server down.  I'm digressing though – this isn't the school of brute-forceology.

Verifiably Exploitable Platform…

If I were to assume TeachingChile.com was using MachForm (which they are), based on 'machform' being in the URL or some other simple means, then a quick Google search turns up existing (and very well documented) MachForm exploits that might apply – the latest having been uncovered less than two months ago.  *shakes head* … I'll just stop there on the topic of exploits.

Just 1 of 1,000+ Other Sites on the Server

With over 1,000 other sites likely hosted at the same IP address (server), I wonder what the odds are that the server itself isn't entirely compromised already? Source: http://www.reverseip.us/?url=teachchile.com

Running across this situation on a legit website isn't something that should happen anymore. I'm blown away by the seemingly legitimate operation being run on TeachingChile.com.  Beyond notifying their contacts I'm not sure what else to do about it.  Should anyone beyond their posted contact be notified of this? Let's hope this application isn't to teach web development. ^_^

The purpose of this post!

Be very aware of what you're doing when you release personal information online.  In this case it's pretty safe to assume that data submitted to TeachingChile.com will become the property of some nefarious individual. Unless you have some otherwise unobtainable insight into what happens to your data after you submit it, be cautious.  It doesn't take much for a web server to fall victim to an automated attack, and data specifically is the prize, because that's what everything is all about anyway.

Nobody reads the 'terms of use' or 'disclaimers' anyway (and in many cases, neither do the writers of those things, so they don't identify how your personal information is being securely managed; further, there is no enforcement to ensure that what you're reading is in fact what happens), so it's best to assume all the data you're submitting to a website is going to be retained indefinitely by an individual or a staff that isn't specifically driven by keeping your data safe. Most techs are extremely trustworthy, however often quite lazy.  It doesn't take much oversight for a whole database, a server, or better yet a cloud-driven limitless data storage asset to become the property of an attacker.  It's often just a password between the evil attacker and your personal information.

Don’t EVER submit your Social Security Number or Passport information online.  Just don’t. Perhaps try using your dog or cat’s social security number as a temporary placeholder. ^_^

Migrating web hosts.

I've migrated more web hosts than I care to count, yet it seems to remain, for the most part, a mystery and a magic trick to migrate a hosted web service (in this case I'll stick to general web hosting) from one server to another without fault or downtime.  In other words: a seamless migration.

I recently migrated a miscellaneous handful of web sites and associated services from GVO (possibly the worst hosting company I have ever worked with) to Site5 (reasonable, affordable, simple hosting).  Here's how I went about the seamless transfer.  In this case both hosts had cPanel & WHM, however, because the websites on GVO were set up as add-on domains rather than standalone accounts, I wasn't able to just back up the whole picture in a single file and publish it to the new server.  The intent was to properly set up each domain as its own account (grouping some domains within a single account when it made sense to do so).

Logically, the whole operation breaks down into individual domains, or sets of domains that most often simply redirect to a single primary domain.

Information & Resource Gathering

Identify the domain's registrar, owner, and management credentials, or the person responsible for managing (paying for) the domain name itself. This might be GoDaddy, Network Solutions, etc. This information is easily sourced through a WHOIS query.

Identify the domain's name servers and, again, source the management credentials for the name server zone file. Name servers look like NS1.SOMETHING.COM, NS2.SOMETHING.COM, and so on.  This information can be sourced from the WHOIS data as well. Managing a domain's zone file is often done within the existing hosting account's control panel.

Review the domain's zone file.  Grab a copy of it, even.  It'll reveal all the subdomains, redirects, mail exchangers, and so on – all the items you'll want to make sure migrate properly.

Simple zone file records (screenshot of a zone file editor).
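
A couple of command-line shortcuts for this gathering step, with example.com standing in for the domain being moved:

whois example.com                            # registrar, contacts, and name servers
dig NS example.com +short                    # authoritative name servers
dig example.com ANY +noall +answer           # currently published A, MX, TXT, etc.
dig @ns1.example.com example.com AXFR        # full zone dump, if the name server permits it (most don't)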

Simple (standard?) web hosting can be broken down into three primary elements.

  1. Files
  2. Database(s)
  3. E-Mail System

Files can usually (simply) be copied from the old server to the new server.  Keeping the file permissions intact during the transfer is healthy, otherwise you end up with file upload folders, like the ones in WordPress, that no longer allow files to be written to them.  This can get especially complicated in old-school LAMP environments where Apache runs as a single user; it is, however, not the case in most Windows environments.
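
A minimal sketch of the file copy over SSH, with hypothetical users, hosts, and paths; rsync's archive mode preserves permissions, ownership, and symlinks as far as the receiving account allows:

# Copy the document root from the old host to the new one, keeping permissions and timestamps.
rsync -avz --progress olduser@old-host.example.com:/home/olduser/public_html/ /home/newuser/public_html/

The trailing slash on the source matters: it copies the contents of public_html rather than the directory itself.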

FTP User Accounts – Sometimes a client will have multiple FTP user accounts for various reasons – don't overlook them.  Occasionally I've run into situations where clients (or I) have set up automatic backups over FTP/SFTP both to and from the web host.  Identifying the things you're going to break before you break them is always handy.

Databases are another simple acquire-and-post situation.  Given phpMyAdmin or mysqldump, just export the databases from one host and then import them into the next.  I create new username/password combinations for simple websites and CMS systems like WordPress, etc. If for whatever reason you are unable to duplicate the previous host's database name, user name, and password, you'll want to update the configuration files for any website that makes use of a database resource.  A last note on databases: it's been a few years since I've seen any issues migrating from one database version to another, but it's still worth noting that if you're migrating between different major database versions, be prepared for problems. Try to upgrade the old database, or import and update the data, before trying to import between database versions.
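
A minimal sketch of the dump-and-restore, with every database name, user, and password a placeholder:

# On the old host: dump structure and data to a file.
mysqldump -u old_dbuser -p old_dbname > old_dbname.sql

# On the new host: create the database and user, then import the dump.
mysql -u root -p -e "CREATE DATABASE new_dbname;
  CREATE USER 'new_dbuser'@'localhost' IDENTIFIED BY 'new_password';
  GRANT ALL PRIVILEGES ON new_dbname.* TO 'new_dbuser'@'localhost';"
mysql -u new_dbuser -p new_dbname < old_dbname.sql

If the names changed, update the site's configuration (wp-config.php for WordPress, for example) to match before you start testing.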

Consider purging the junk – It's easy to simply copy all databases to a new host, but I prefer not to let clutter and waste build up on web servers.  I always think in the back of my head that eventually some old, unused piece of code or database entry will be used to exploit something, so it's best purged when possible.  In most cases it's not necessary, but if files and content are not being actively used then their only real use is as a target for exploits.

E-Mail systems are (in my opinion) best operated outside of the web hosting environment, but since email hosting comes as part of the standard web hosting package, you'll often find the e-mail accounts and the mail exchange on the same machine.  I've been a fan of Google Apps for Business for my own e-mail, which requires no changes during a host migration – just remember to maintain the same MX entries in the domain's zone file.
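
For example, with Google Apps handling mail, the MX records carried into the new zone file would look something like this (example.com is a placeholder; the hostnames are the standard Google ones):

example.com.  3600  IN  MX  1   ASPMX.L.GOOGLE.COM.
example.com.  3600  IN  MX  5   ALT1.ASPMX.L.GOOGLE.COM.
example.com.  3600  IN  MX  5   ALT2.ASPMX.L.GOOGLE.COM.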

Set Up the New Host

Matching the existing setup, configure the new host environment: transfer files, databases, e-mail accounts, etc.

To ensure no changes are made to the files and database (CMS content) on the old server while you set up the new one, be sure to disable any login systems, FTP accounts, etc.

After I think I've got everything set up on the new server properly, but BEFORE updating any DNS or NS records or otherwise making the new host live, I add an entry to the zone file on the old server (or name server management resource) that points to the new host, and I begin testing the environment with a subdomain of the existing domain. I also add the same record to the new host just for the sake of consistency.  This is generally enough to test most CMS systems with little trouble.  While I'm making updates to both the old and new hosts' zone file configurations I also add a subdomain (A) record targeted at the old host.  It's come in handy a time or two. In this case (zone-file syntax is sketched after the list):

  • (A) site5.domain.com  =   IP.Address.of.New.Host
  • (A) gvo.domain.com    =   IP.Address.of.Old.Host
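
In zone-file syntax those test records would look roughly like this (the IPs are placeholders from the documentation ranges):

site5.domain.com.  3600  IN  A  203.0.113.10    ; new host (Site5)
gvo.domain.com.    3600  IN  A  198.51.100.20   ; old host (GVO)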

Now – after doing as much testing as possible, get someone else to review what you've done.  Generally speaking, the customer or client is not a good target for this task.  Just ping another tech with whom you share system administration tasks, resources, and so on.  A quick review isn't much to ask and will help make sure that nothing blatantly obvious was overlooked.

While your work is being reviewed, set up some simple automated backups.  Never rely on your host as the sole overseer of the content you pay them to maintain.  I've played every role there is, from managing simple reseller hosting accounts to maintaining dedicated, colocated & self-hosted servers, and in the last 12 years I have seen catastrophic failure twice.  The first time there were no backups outside of the host that crashed, and I lost years of resources.  So no matter how wild or far-fetched a failure would have to be to truly atomize everything beyond any capacity to recover it – it can and will happen.  Prepare for it!
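
A bare-bones sketch of that kind of backup, with every path, host, and credential a placeholder – the point is simply that a copy lands somewhere the web host can't take down with it:

# /etc/cron.d entry: run the backup script nightly at 3am.
# 0 3 * * *  newuser  /home/newuser/bin/nightly-backup.sh

#!/bin/bash
# Dump the database and archive the web root, then push both off-host.
STAMP=$(date +%F)
BACKUPS=/home/newuser/backups
mysqldump -u new_dbuser -p'new_password' new_dbname | gzip > "$BACKUPS/db-$STAMP.sql.gz"
tar -czf "$BACKUPS/files-$STAMP.tar.gz" /home/newuser/public_html
rsync -az "$BACKUPS/" backupuser@offsite.example.com:backups/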

Then….

FLIP THE SWITCH

Update the domain's primary (A) record in the zone file to point at the new server, then update the name server entries for the domain at the registrar so they point at the new host.
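
Something like the following (domain.com and the name server are placeholders) shows whether the rest of the world is seeing the new address yet:

dig +short domain.com A @ns1.newhost.example.com   # answer straight from the new name server
dig +short domain.com A @8.8.8.8                   # answer from a public resolver
watch -n 60 'dig +short domain.com A @8.8.8.8'     # re-check every minute until it flips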

…and test. Test. Test.  Everything should be green – bring on the client to verify every page is perfect.  Run a link checker, monitor system logs, etc.  I use UptimeRobot as a third-party resource to see if my sites kick rocks outside of my own monitoring.  If you seamlessly migrated from one host to another then none of your uptime monitoring tools should have complained.  Further – Google Webmaster Tools shouldn't complain either.

I'm sure I skipped a bunch.  I'll add things as I realize what they are.