Securing and Backing Up Your Data

I’m sure you’ve had this experience: you flip open your laptop and it doesn’t boot up. You drop your phone and it won’t turn on.

You pause for a second… “is any of my stuff backed up?”

It wasn’t until this happened to me (my hard drive almost fried) that I started taking backups seriously. Security and digital backups are one of those “important not urgent” things it’s easy to forget about until it’s too late.

Here’s my take on how to think about backing up your digital world and making sure others can’t get access to it.

Security & Privacy

Assume all of your private information—SSN, address, password, phone, etc—will be public at some point. It’s only going to get easier to hack into large systems.

Why? Here’s one example.

Each website requires hundreds or thousands of software packages, downloaded from across the internet, in order to run. If just one of them gets hacked or has a vulnerability, it’s relatively easy for an attacker to scan the internet for all websites using that package and attempt to break in. This sort of hack happens often.

Still think I’m crazy? Plug your email into this site.

The good news is, if you set up your digital world correctly, you can easily be immune to these hacks:

Make sure any single password isn’t that important
Make it challenging or impossible for anyone to log in with just a password
Set up automated alerts when someone else is acting on your behalf

Backing up Your Digital World

Thanks to Time Machine and WiFi iPhone backups, it’s gotten much easier to back up your data. For most folks, a Time Machine backup to an external disk is probably enough.

However, my computer contained everything to run my business for many years so I took my backup game up a notch.

Here’s how I think about backups:

You should assume something bad is going to happen. Your computer will break, your external hard drives will die, a virus will wipe your disks, or someone will steal your computer. Hopefully, none of these things ever happen, but planning as if they will happen is the safest approach.

The time it would take to recreate a single important file is more costly than a decade of storage costs. In other words, don’t try to save a couple bucks by not backing up your files.

Disk space is cheap. Whether cloud or external drive space, it’s worth paying another $100/year to have enough space to never worry or think about losing data.

Eliminate single points of failure. A Time Machine backup is great, but what if someone comes into your house and steals your electronics? Or a baby spills water all over your laptop and hard drive? I want my data to be safe even if my entire house burns down.

Make it easy and automated. I want to set my backup strategy up once and never think about it again.

My Toolbox

So, how do we get this done? First, here are the tools you’ll need:

1Password. This is, by far, the most important part of this toolbox. I’ve used 1Password for many years and it’s awesome. It gets better every year without fail. It’s a paid product, but it’s worth every penny. I store passwords, credit cards, personal data, etc in it. Yes, it’s safe to store all of your data in your 1Password vault—lots of articles out there explaining why.

Arq. Backs up data on your computer (and other attached devices) to cloud storage. Supports lots of different storage options, including Amazon Drive. It’s a paid product, but it’s not a subscription service.

Amazon Drive. My Prime membership includes free photo storage, and their storage tiers beyond photos are very cheap ($60/year for 1TB). I don’t use it for document storage/sharing, but it’s great for backing up data.

1TB external hard drives for my wife’s and my computers, for Time Machine. Drives that don’t need an external power source are more convenient.

Google Drive. I use this for any non-media documents.

GitHub. For code. I store all of my projects in ~/Projects. For the technically inclined, documenting your configuration in a dotfiles repo is a great way to back up config or preferences that may be skipped by Arq.

Building Your Backup System

Next, you’ll want to set up your backup system:

Set up the Amazon Photos app on your phone, and set it to automatically back up all photos and movies.

Set up WiFi iCloud backups on your phone. Disable photo backups via iCloud to avoid running out of space quickly.

Don’t try to organize old files; instead, keep your workspace (i.e. your Desktop, Downloads, Documents, etc folders) organized. If old files are piling up, group them into a “May 2020 Documents” folder and throw them in an Archive folder on your external hard drive (if you need more space on your primary machine).

Set up the Dropbox & Google Drive sync apps. If you have multiple Google Drives, use InSync to sync them to different folders.

Set up Time Machine against an external hard drive.

Set up Arq:
- Add your Google Drive & Dropbox folders.
- Add places you put files outside of cloud storage tools. For me, that’s Desktop & Downloads. I put all of my GitHub code in ~/Projects. This approach works for any non-media “project” files—put them all in a common directory and back the whole thing up.
- Add any external drive folders that contain files that aren’t on your machine’s drive.
- Connect to a backup destination. I use Amazon Drive.
- Set up a daily automatic backup.
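The “group old files into a dated folder” step is easy to automate. Here’s a sketch of the idea in Ruby — the 90-day threshold and folder naming are my own assumptions, so point it at a copy of a folder before trusting it:

```ruby
# Sweep files untouched for 90+ days into a "<Month Year> Documents"
# folder, named after each file's last-modified date.
require "fileutils"

def archive_stale_files(dir, max_age_days: 90, now: Time.now)
  cutoff = now - max_age_days * 24 * 60 * 60
  Dir.children(dir).each do |name|
    path = File.join(dir, name)
    next unless File.file?(path)

    mtime = File.mtime(path)
    next if mtime > cutoff

    bucket = File.join(dir, "#{mtime.strftime('%B %Y')} Documents")
    FileUtils.mkdir_p(bucket)
    FileUtils.mv(path, bucket)
  end
end
```

Run it against your Desktop or Downloads folder on whatever cadence suits you; everything it moves stays on disk, just grouped by month.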

Here’s what my Arq configuration looks like:


Some notes & caveats:

I don’t have any on-disk music or video files. This approach should still work if you have a lot of media, but you’ll want to think about which videos & audio you really need to back up to Amazon Drive to limit your backup costs.

Keeping an external drive connected to your computer isn’t really practical. I haven’t connected my external drive to my laptop in a couple months, which isn’t great. The easy way around this is buying a drive that connects to the network.

Securing Your Data

Now it’s time to secure your data! Here’s what to do:
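A quick aside before the steps: the rotating six-digit 2FA codes that come up below are plain TOTP (RFC 6238): an HMAC of the current 30-second window, truncated to a few digits. Here’s a minimal sketch, checked against the RFC’s own test vectors; real QR codes carry a base32-encoded secret, which an app decodes first:

```ruby
# Minimal TOTP (RFC 6238) sketch: the same math 1Password, Google
# Authenticator, etc. run every 30 seconds. The RFC's test secret is
# raw ASCII; real secrets arrive base32-encoded in the QR code.
require "openssl"

def totp(secret, time: Time.now.to_i, digits: 6, step: 30)
  counter = [time / step].pack("Q>")             # 8-byte big-endian counter
  hmac = OpenSSL::HMAC.digest("SHA1", secret, counter)
  offset = hmac[-1].ord & 0x0f                   # dynamic truncation (RFC 4226)
  code = hmac[offset, 4].unpack1("N") & 0x7fffffff
  format("%0#{digits}d", code % 10**digits)
end
```

That’s all a “2FA device” is: the shared secret plus a clock. Which is exactly why storing the secret in 1Password works.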

Set up 1Password on your phone and computer. Store every password you ever use here. 1Password will magically identify passwords that are insecure or have been leaked to hackers. This makes it really easy to incrementally eliminate insecure passwords.

Use a unique password for every login. Do not use the same password everywhere. 1Password will generate a beautifully random password for you automatically if you install the Chrome/Safari extensions (which you should do)!

Set up Two-Factor Authentication (2FA) on all important sites. This includes Google, Amazon, banks, investment accounts, etc. Yes, it takes a couple minutes to set up, but it makes it much more challenging for someone to hack your account. 1Password can actually be your 2FA device (as opposed to using your phone). If you add a “one time password” field to a 1Password entry, a scanner will pop up on screen. Put that scanner over the QR code and boom—your 2FA codes will live in 1Password.

Review your Google, GitHub, Dropbox, etc account connections each year (don’t forget!). It’s very easy to grant permissions to a 3rd party app that are larger in scope than you thought when you clicked “ok”.

Print out your 1Password password each year and put it somewhere super safe. This protects against 1Password going down, or something else horrible happening to your computer or the internet.

Monitor & Lock Your Credit

It doesn’t hurt to lock your credit and set up some simple monitoring. Here are the three credit bureaus where you’ll need to lock your credit:

TransUnion
Equifax
Experian

I’ve found that CreditKarma is a great service for monitoring your credit score & activity.

As with most of these posts, this one was mainly written so I can document and improve my thinking. If you have any critiques or ideas, I’d love to hear them.

Continue Reading

Disruption Ahead: The Sharing Economy Revolution

Yesterday I used an Uber for the first time.

I’m not often in the city without a car, so I rarely have the need for a taxi. However, yesterday I needed to get across town as soon as possible: it was extremely cold, and the buddy I was meeting only had two hours before he had to go. I didn’t want to waste 40 minutes of it on a bus.

As I pulled up Google Maps, an ad for Uber appeared at the bottom of the public transportation options. It promised 1/4 of the travel time. An in-context ad filling a real need I was experiencing right then. Great job, Google & Uber.

What I Learned From the Driver

I set up the app (awesome onboarding experience; the app is great) and hopped into someone’s car within minutes. I have a curious mind and love engaging with strangers, so naturally I struck up a conversation. Here are some things I learned:

He didn’t own the car. He rents it for $100/day. Interesting that business structures are forming to eliminate barriers to entry.

You make more money as a traditional taxi driver. Uber takes 20%; a traditional taxi service handles the complexity of owning a car and takes a flat fee.

Uber grows the network by aggressively giving away free rides. Simple but effective strategy.

All of the operational complexity is handled by software. Credit card processing, taxes, 1099s, customer support, dynamic cost adjustment based on demand, etc.

It creates security for both parties: there is no chance of me not paying, and Uber ensures that the driver is taking the most efficient route.

It costs more, but it’s worth it. This is really a situation where value is added on both sides.

My mind was blown by the whole experience, not because of Uber, but because of what could come down the pike.

When Logistics is Handled by Software

Logistical tasks that suck up time are a pet peeve of mine. A couple months ago I was changing the oil on my car and came up with this idea:

Business idea: an oil change company that does pick up and drop off. I'll be your first customer. 🙂

— Michael Bianco (@mike_bianco) November 18, 2014

What if the solution wasn’t a company, but instead a service which connected you with local oil change service providers who came to your house to change your oil?

What if a neighbor came from down the street to cook your meal instead of going out to dinner? What if you just purchased their leftovers?

What if there was a marketplace for any service that eliminated the coordination cost normally associated with service businesses?

When the challenge of connecting with someone who wants the expertise you have is eliminated, there are going to be more service providers, more purchases, lower prices, and more competition.

Continue Reading

Cloud Based DID With a POTS Based Asterisk System

A while back I wrote about my experience setting up a business phone system with Asterisk, Polycom, and POTS. This system has been working fine over the last year; I’ve only had to dive in once or twice to fix a couple issues (which I’ll detail in a future post).

However, recently someone using the phone system needed a Direct Inward Dial (DID) number for their phone. I couldn’t find any concise walkthrough about how to set this up, so I’ve written down my process of figuring it out.

I knew Asterisk supported DID, and I found a guide that walked you through setting one up.

If you’re using POTS for your calls and not a SIP trunk, DID gets tricky really fast. Some information mentioned voltage, analog signaling, and other tricks; having to work with your telephone provider; and PRI, number pulsing, and reversing polarity on analog lines. Both approaches seemed very messy and time consuming to me (working with my telephone company – Verizon – has only been painful). My hunch was that the lines coming into our building weren’t clean and we would have issues with the connections (the contractor who set up the lines mentioned something about the lines being dirty and not being able to fix them). I’m not a VOIP professional – nor do I want to be – and line voltage and electrical signaling are the last things I want to deal with.

Another possibility would have been to dedicate a physical line to the specific phone off the PBX and take that number off the hunt group. I could create an inbound route in Asterisk linking the dedicated line to the extension off the PBX. If I used this method, I would have had to take the line off the hunt group because, depending on the number of people calling in, the line might be picked up and anyone using the DID would get a busy signal.

However, this would require buying more hardware and ordering another POTS line, something I didn’t want to do (mainly because of the time involved).

At this point I started to regret going with POTS over SIP trunks for our phone provider. However, POTS offered significant savings for us. Our lines through Verizon are relatively cheap (somewhere around ~$25/line) and came with unlimited minutes. When I looked at cloud phone providers they were charging $0.02–0.03 a minute. With our call volume this would have ended up costing us a good chunk more per month. Taking a look at cloud phone providers now, it looks like the landscape has changed, and cloud and landline providers seem to be much closer to price parity.
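The back-of-the-envelope math behind that decision, using the midpoint of the quoted per-minute rates (your numbers will differ):

```ruby
# Where does a $25/month unlimited POTS line beat a metered cloud trunk?
pots_monthly   = 25.0    # $ per line, unlimited minutes
sip_per_minute = 0.025   # $ per minute (midpoint of the $0.02-0.03 quotes)

breakeven_minutes = (pots_monthly / sip_per_minute).round
# => 1000 minutes/month; past that, POTS is the cheaper option per line.
```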

I figured it must be possible to forward a call over the PSTN to a SIP endpoint. Twilio definitely doesn’t allow connections to SIP endpoints right now (although they have a service in beta). With Tropo it seemed possible, but I couldn’t test an outbound phone connection without going through a verification process. Additionally, outbound calls ran $0.03/minute, which seemed high.

After some googling I found DID Logic, which on their front page mentioned call forwarding to a SIP URI – great. The company seems a bit shady (“bought a theme + plugged in some standard marketing copy” sort of site), but since I could test things out for a couple bucks I went ahead and tried it out – it works great.

I grabbed a number off of DID Logic for ~$1/month + $0.007/minute and forwarded it to my PBX.

However, the Asterisk box was not setup to pick up anonymous SIP calls and my SonicWall firewall was not setup to expose SIP ports to the PBX.

Setting up Asterisk to Support SIP Call Forwarding From DID Logic

Make sure “Allow Anonymous InBound SIP Calls?” is enabled in the General Settings on your FreePBX admin (a SIP call from DID Logic will be anonymous).

Then add an InBound DID number in the extension’s control panel. With this DID assigned you can now call your phone using the SIP URI “”. Most likely your PBX is behind a firewall, so you’ll have to configure your firewall + NAT settings to forward SIP traffic to the PBX.

SonicWall SIP Traffic Forwarding Setup

You need to have both a firewall rule and a NAT routing policy for SIP traffic to work on a Sonic Wall device. Check out this comment for information about UDP timeout settings and some other general information about SonicWall Config.

Your Firewall rule should look something like:

WAN –> LAN
Source: Any
Destination: Any
Service: VOIP
Action: Allow

Your Firewall rule should not handle the routing to a specific IP address even though there is an option for it. The routing is handled via your NAT rule (General > NAT Policies) and should look like:

Original Source: Any
Translated Source: Original
Original Destination: Any
Translated Destination: PBX
Original Service: VOIP
Translated Service: Original

The PBX destination won’t exist by default. To create that destination, go to Firewall > Address Objects and add one referencing your PBX on your LAN.

Also, ensure that “Enable consistent NAT” is enabled under VOIP > Settings. Do not enable any of the transform options.

Continue Reading

Billings Pro Touch Server & Client Syncing Issues

Quick tip for anyone having issues with getting their iPhone’s Billings Pro app to sync with a local Billings Pro server:

I recently grabbed an Asus RT-N16 and flashed it with DD-WRT. It was working great until I was fiddling with some of the wireless settings and accidentally reset the router.

After reconfiguring the router, my iPhone with Billings Pro Touch would not sync with the local Billings Pro Server. For some reason the network tab on the server admin GUI wasn’t picking up my latest public IP and/or reporting it to the switchboard service correctly. To fix your reported public IP: log out of the switchboard, click advanced, manually set your public IP, and log in to the switchboard again.

However, my phone would still not connect to my local server. It seemed like it had cached the old public IP and wouldn’t update its connection details. The iPhone UI is sparse and doesn’t include any way to manually update connection details. However, if you navigate to Settings –> Server, click on your switchboard account, and tap into the password field, the “Done” button will appear in the iPhone UI. Tapping Done seems to pull the updated switchboard login details, and it should then sync without a problem.

Not sure if this made a difference, but I manually setup port forwarding for 60525-60527 and 7113-7118.

Continue Reading

MacRuby 0.12, RVM, and Gem Installation Problems

I recently jumped back into a MacRuby project that I hadn’t touched in a while. I upgraded to the latest MacRuby 0.12, installed the necessary gems via macgem install, and was presented with this error:

[code]Segmentation fault: 11[/code]

Since I started this project my ruby setup had drastically changed: RVM, custom irbrc, and lots of other tools that I’ve found essential for productive rails development had been installed. I noticed that macgem list --local returned the list of gems needed for my rails project.

Running env from the command line revealed that GEM_HOME and GEM_PATH were set explicitly in my bash env, a result of having RVM installed and a non-system ruby set as default. These two environment variables were causing macgem to look for and install gems in the rvm gem directory. To fix the issue, run these two commands in your shell and then rerun your necessary macgem install commands:

[code]
unset GEM_PATH
unset GEM_HOME
[/code]

Continue Reading

Product Pricing in a Zero Marginal Cost Distribution Environment

Jarrod Drysdale on digital product pricing:

Our strategies were very different. Sacha wrote a book and priced it relative to the cost of other books, which is the strategy just about everyone follows. Instead of that, I wrote a book and priced it based on the value it provides.

Choosing a pricing strategy based on competition is a natural approach, but also a flawed one. Price competition implies scarcity—supply and demand market forces. There is no scarcity for ebooks because digital files are replicated practically for free.

Seth Godin has mentioned this before: there seems to be a ‘race to the bottom’ effect with a lot of eBooks, but many are doing fairly well with pricing way above the competition if they are in a market with scarce competition. Of course this is nothing new – small supply relative to demand results in an above-market price.

If you’re not planning on growing a business or establishing a brand (including your own ‘personal brand’ – your value in the marketplace), then selling a one-off book (or any sort of digital content) by estimating the intersection of supply and demand curves might work.

However, every product has auxiliary assets whose value increases or decreases depending on how the product is priced, designed, and released.

Mailing list growth.
Establishment of a respected voice in a niche market or field.
Growth of enthusiastic fans.
Possibility of a future acquisition.

All of these intangible assets are not easily valued because in most cases they are dependent on the future. However, they have real value, and possible growth in any of these assets can affect the short-term pricing of a product or service. I think this is what makes digital goods pricing challenging – why some books are on sale for $3.99, some free, and some less than 150 pages and $50. I don’t think there is ever going to be any one model that works – when you can slice and dice pricing into many different facets the possibilities are endless.

Continue Reading

Setting Up QuickBooks on Windows XP Professional for Multi-user Environment

Recently I was involved in a project moving a company’s files from an old 2003 Windows Exchange server to a Mac mini server setup. The first step was to move from Exchange’s email and calendaring to Google Apps. After that step was complete, we moved the shared files drive over to Thunderbolt RAID 5 storage attached to a Mac mini server. The transition was pretty smooth; however, there was one problem which wasted a significant amount of time.

The accountants in the organization use QuickBooks 2010 for all accounting purposes. Moving to a hosted solution was not an option, and they needed multi-user access to the file (2–3 people could be working on the same QuickBooks file at any given moment). Our old 2003 server had the QuickBooks database server installed, which seemed to work fine. Unfortunately, if you are not an ‘enterprise’ QuickBooks customer, there is no Linux version of the database server available. There was an old Windows box lying around (fairly fast: dual core 2.8GHz, 3GB RAM) that would be a perfect fit (or so I thought…) for a QuickBooks server. I wiped the box, installed Windows XP with all updates, removed all the crapware, and installed the QuickBooks database software, but had significant trouble getting the database server to work correctly.

I ran the QuickBooks Network Diagnostic tool, but it did not report any errors. When opening the QuickBooks file from a client machine in multi-user mode, the login prompt would come up fine, but after entering the correct login information it would time out with an error message stating there was a connection issue (H202) and suggesting an ‘alternative’ method (there was a significant delay between initiating login and getting a response). Note that QuickBooks at this stage would correctly report an incorrect password.

The network setup in the location where this was occurring had a local server running DNS. The QuickBooks server had a static IP set.

Here are some general notes on setting up QuickBooks:

Some mentioned that anti-virus software on the client machine causes slow operation. This didn’t seem to make a difference in my case.
Tried turning the firewall off on the server + client machines: no difference (proper port settings were already in place).
Pulling data off of the shared QuickBooks folder on the XP machine wasn’t bad: 15MB/sec on a badly engineered 10/100 network (there are 5-port 10/100 switches in probably 5–10 locations around the office).
Opening the QuickBooks file in single-user mode from a client machine worked fine.
Launching QuickBooks 2010 on the server and opening the file in multi-user mode, then opening the file from client machines, worked fine as well.
The significant delay between the login screen and the error message pointed to some sort of look-up timeout, but given that file access to the machine was fine, this didn’t make a ton of sense. However, this seemed to be part of the issue.
It is important that the daemon process for QuickBooks Database Server is part of the administration group.
On another note: some great information on backing up QuickBooks, and how to schedule backups of QuickBooks from the QB Pro interface.

What finally fixed the problem was adding the computer-name (aka server name or BIOS name) to the hosts file. Opening up QuickBooks is still painfully slow, but at this point it works.
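For reference, the fix is a single hosts-file line on each client machine; the IP and name below are placeholders, so substitute your server's actual static IP and computer name:

```
# C:\WINDOWS\system32\drivers\etc\hosts
192.168.1.50    QBSERVER
```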

Update 02/24/2012

After moving (and completely eliminating the Windows server from the network), problems ended up cropping up again. QuickBooks seemed to rely on the WINS server (a crappy Windows alternative to DNS) to some extent. After editing the lmhosts file (same location as the hosts file in Windows) and manually adding the NetBIOS entries, everything seemed to work. Note that there is not an lmhosts file by default! There is an lmhosts.sam (.sam stands for sample). To activate the file you have to remove the extension (watch out for hidden extensions). On the machines that are using QuickBooks, both the hosts and lmhosts files have manual entries for the QB server. Not sure if this is necessary, but it worked.

Update 08/25/2012

The Windows XP QuickBooks Server will seemingly disappear from the network. You can still ping the machine, still access it directly via mouse + keyboard, but accessing the file system or connecting to the quickbooks database will cease to work. I don’t have enough windows knowledge to know if this is a windows bug or not, but I’ve scheduled an automated restart twice a week and this seems to fix the issue.

Continue Reading

Migrating Rules From Microsoft Exchange to Google Apps

Recently I was part of transitioning the email system of a 20+ employee business off of a Microsoft Exchange 2003 server to Google Apps. Moving close to half a million emails to a new email service was a big decision. The transition tools that Google has in place are pretty good, albeit slow for that many emails: Google throttles email transfer to one message per second after the first 500. However, the one piece that was missing was a good tool for transitioning Outlook server rules. Many employees used those rules extensively, and many had 50–100 of them. Outlook does not have any method in place for extracting those rules. There is no built-in way to get any sort of list or description of the rules; if one wanted to transition the rules manually, they would have to click on a rule, look at the pop-up window, and recreate the rule in Gmail using the filters functionality – repeating this two-step process for each rule. Horrible.

This would waste many hours of valuable time, so I started hunting for a better solution. There is an API in Outlook 2007 or higher that enables access to rules. There isn’t much example code available, and to my surprise I couldn’t find any VB script to export a CSV of all the rules associated with an Outlook account! I hacked together a really rough VB script which exports Outlook rules (only one rule type right now; that’s all I needed for my use case) as a CSV, and then wrote a small ruby script to generate an XML doc of the rules for import via Gmail’s filter import/export available through Gmail Labs. It works fairly well assuming you have an updated version of Outlook 2007 or higher.
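Here’s a sketch of the second half of that pipeline: turning parsed rules into the Atom-feed XML that Gmail’s filter import accepts. The property names mirror what Gmail’s own filter export emits, but treat the exact fields (and my simple from/label rule shape) as assumptions:

```ruby
# Turn simple {from:, label:} rules into Gmail's filter-import XML,
# an Atom feed of <entry> elements with apps:property children.
require "rexml/document"

def rules_to_gmail_xml(rules)
  doc = REXML::Document.new
  doc << REXML::XMLDecl.new("1.0", "UTF-8")
  feed = doc.add_element("feed",
    "xmlns"      => "http://www.w3.org/2005/Atom",
    "xmlns:apps" => "http://schemas.google.com/apps/2006")

  rules.each do |rule|
    entry = feed.add_element("entry")
    entry.add_element("category", "term" => "filter")
    entry.add_element("apps:property", "name" => "from",  "value" => rule[:from])
    entry.add_element("apps:property", "name" => "label", "value" => rule[:label])
  end

  out = +""
  doc.write(out)
  out
end
```

Feed it the rows from the exported CSV and import the resulting file through Gmail’s filter import screen.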

Google Apps Transition Notes

The server migration tool pulled in some calendar events that employees claimed they deleted long ago.
The Google Mail Uploader application for Mac is not consistent. It wouldn’t pick up mail on some computers, and it doesn’t handle folder hierarchy (it flattens everything). Use the server migration tool instead.
Folder unread counts don’t update immediately. This might be an isolated issue with Lion.
I had a problem with one Mac machine (10.6) where the inbox would randomly appear blank. Clearing all support / cache files and adding the mailbox with message + attachment cache disabled fixed the issue (after mail downloaded I enabled cache again).
Gmail doesn’t seem to handle lots of folders (labels) well; things seem to be a lot slower with multiple folders.
Hiding the automatic All Mail, Misc, Follow-up, etc folders was helpful for those who were not familiar with Gmail.
Changing some of the local settings makes Gmail play a bit nicer.
Still can’t find a good solution to allowing a user that is an administrator of another user’s calendar to create an event with the organizer being marked as the calendar’s creator. Use case: administrative assistant managing an executive’s calendar.

Continue Reading

MacRuby Deployment + Load Order

After reading the official MacRuby docs on deployment, I read over this guide. Although the deployment build seemed to be working fine on my local machine, when I dropped it onto my laptop with a standard Lion install it crashed, claiming that there was an undefined constant – but that constant was a class. How could it be undefined if it ran fine locally?

Looking into it a bit more, the undefined class was being used as a superclass for another Ruby class. Taking a look at rb_main.rb revealed that there is no specific load order. Since the load order was undefined, the class requiring the other class as a superclass was being loaded before that superclass. I ended up tweaking the rb_main.rb file to allow for a manual load order, followed by the standard automatic load.

[code lang="ruby"]
# Loading the Cocoa framework. If you need to load more frameworks, you can
# do that here too.
framework 'Cocoa'

# Loading all the Ruby project files.

# A manual load list lets us specify the load order for some of the classes.
manualLoad = ["VTiTunesHeader"]
for file in manualLoad
  require file
end

manualLoad << File.basename(__FILE__, File.extname(__FILE__))

# Auto load the rest of the files in the resource directory.
dir_path = NSBundle.mainBundle.resourcePath
Dir.glob(File.join(dir_path, '*.{rb,rbo}')).map { |x| File.basename(x, File.extname(x)) }.uniq.each do |path|
  require(path) unless manualLoad.include?(path)
end

# Starting the Cocoa main loop.
NSApplicationMain(0, nil)
[/code]

You can grab the gist here.

Cocoa Resources

Some Cocoa libraries / snippet repos that I found during my latest dev session.

jrc’s Cocoa Snippets
JSON Kit – awesome objc JSON library
Cocoa Objects – useful index of pluggable Cocoa code
AQToolkit – great collection of useful pluggable categories / classes. Contains a useful NSFileManager category for managing temporary files.
NSString category for generating a query string from an NSDictionary
ASIHTTPRequest – comprehensive HTTP request library

Random Tidbits

Although old news to most, you can grab the last n bytes of a file using tail -c. Very useful for cutting down on the size of large text log files.
I pulled the build versioning code out from a project I was working on. Take a look at this build numbering gist; it provides source to pull the version number from git or svn and write it into your Info.plist.
The Ruby logging class is more robust than the Log4r class and the built-in logger class.
attr_accessor :variable makes an instance variable Key Value Coding compliant. Just set @variable in your initializer.
Awesome side-by-side reference sheet for PHP, Ruby, Perl, and Python. Handy reference for Python-to-Ruby conversion.
Obj-C blocks in MacRuby.
Although you can `macgem install json`, MacRuby comes with a built-in json library that seems to have tweaks for deployment. Don’t install the json gem.
The Open3 Ruby library does not return subprocess status correctly when using MacRuby.
Online version of “MacRuby: The Definitive Guide”.
PyObjC on Lion is dead. Although you might get an application to run, there are so many bugs it really isn’t usable for production.
Although macrubyd exists, it doesn’t seem to work with full-fledged Cocoa + MacRuby apps. There isn’t any Xcode integration. Ruby-Debug also doesn’t seem to be compatible with MacRuby. Bottom line: no strong debugging tools for MacRuby… yet.
The “throw your dotfiles on github” trend has been an interesting learning experience for me.
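On that last tidbit: the core of every “dotfiles on GitHub” setup is a small script that symlinks the tracked files into your home directory. A minimal, generic sketch (the .bak backup behavior is my own convention, not any particular tool’s):

```ruby
#!/usr/bin/env ruby
# Minimal dotfiles installer sketch: symlink every file in a repo
# into $HOME, keeping a .bak copy of anything it would replace.
require "fileutils"

def install_dotfiles(repo_dir, home = Dir.home)
  Dir.glob(File.join(repo_dir, "*"), File::FNM_DOTMATCH).each do |src|
    name = File.basename(src)
    next if name == "." || name == ".."

    dest = File.join(home, name)
    # Back up a real file before replacing it; stale symlinks just go away.
    FileUtils.mv(dest, "#{dest}.bak") if File.exist?(dest) && !File.symlink?(dest)
    FileUtils.rm_f(dest) if File.symlink?(dest)
    FileUtils.ln_s(src, dest)
  end
end
```

Re-running it is safe: existing symlinks are simply recreated, and real files are preserved as .bak copies.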

Continue Reading