For the past several years I have been able to rely on a near 100% cloud-based development environment. Further, most of my day to day computing was either being synced automatically, or I’d check in or publish what I’d done to an online location.
We had this setup at BraveNewCode, because some of the staff worked remotely, and it just made sense with ample internet bandwidth and the way our products and services operated.
There are many great tools for cloud-based development routines, and with Dropbox, iCloud, Box, etc., there are easy ways to save all your regular working files to cloud locations and sync them across all of your devices over the web.
Once I moved to Prince Edward County, I didn’t really change anything about my process right away. It wasn’t long however until I started to feel slowed down by all the cloud syncing going on, and also felt it unnecessarily chewing up precious bandwidth just to sync with computers… that were on the same local network 90% of the time.
The first thing I decided to do was shutter my private GitHub repositories. I really only had them for client projects, and there were seldom occasions where I needed to share them. Likely I could have just set up a temporary repo on the web hosting server, or just served it from the Synology in the home office (more on that later).
So I closed the account after making sure all of my repositories were safely on my local machine. I didn’t want to keep them on just one machine, however— I still wanted to be able to sync them with my iMac and my laptop, and also get at them from a remote location if I needed to (like a coffee shop or client office).
So after a bit of research I decided to leverage the capabilities our Synology already had to do just that. I knew it could do it, but I hadn’t expected the setup to be so easy, or the result so much better— though as it turns out, it was, and is!
Setting up Git is as simple as installing the Git Server package from the Package Center within DiskStation Manager, and then issuing a few SSH commands as usual to initialize a repo.
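On the NAS side, a repository is just a bare repo reachable over SSH. Here’s a minimal sketch of the flow; so it can be run anywhere, a local temp directory stands in for the Synology volume (the `/volume1/git` path in the comment is my assumption about where you’d keep repos, not a requirement):

```shell
# A bare repo on the NAS is just "git init --bare"; here a local temp dir
# stands in for the Synology volume (e.g. /volume1/git), where you'd run
# the same command over ssh.
NAS_REPOS=$(mktemp -d)
git init --bare "$NAS_REPOS/myproject.git"

# From a workstation: clone, commit, and push, same as with any Git host.
WORK=$(mktemp -d)
git clone "$NAS_REPOS/myproject.git" "$WORK/myproject"
git -C "$WORK/myproject" -c user.email=me@example.com -c user.name=Me \
    commit --allow-empty -m "first commit"
git -C "$WORK/myproject" push origin HEAD
```

On the real device the only difference is that the `git init --bare` happens in an SSH session on the NAS, and the clone URL becomes an SSH address instead of a local path.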
I decided to start fresh with new repos and import my projects— I’d lose my old commit history, but for the most part the old repos were just being stored, and the newer ones had little history anyway.
I followed a guide to get my private key setup for easy login and push/pull, and within 15 minutes was already committing to about 10 different repos via my local network, pushing and pulling from my different machines.
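The key setup itself is standard SSH public-key authentication. A minimal sketch follows; the NAS hostname (`diskstation.local`), the `admin` user, and the repo path are all assumptions for illustration, and the key is written to a throwaway directory here to keep the sketch self-contained (in real use it would live in `~/.ssh`):

```shell
# Generate a dedicated keypair (no passphrase here purely for illustration).
KEYDIR=$(mktemp -d)   # stand-in for ~/.ssh
ssh-keygen -t ed25519 -N "" -f "$KEYDIR/id_synology" -q

# On the real NAS you'd then authorize the key and point a repo at it:
#   ssh-copy-id -i "$KEYDIR/id_synology.pub" admin@diskstation.local
#   git remote add origin admin@diskstation.local:/volume1/git/myproject.git
```

Once the public key is in the NAS user’s `authorized_keys`, pushes and pulls no longer prompt for a password.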
One of the immediate things I really enjoyed was just how fast it was to push or pull on the local gigabit network here at home, especially with projects that contain lots of files and in some cases larger files. What would take several minutes to do remotely with GitHub was taking 10 seconds— a big improvement for my local development routine.
After working in web and software for over a decade, I have some RSI issues. As a result I usually move between my laptop and desktop regularly to change it up in terms of posture. Because I switch between them often, I have built my computer setup for years to be a few clicks away from near identical on both machines, so I can migrate between the two with little friction or delay.
So having the Synology make Git syncing effortless was great for working from home.
Next, I decided to ditch using iCloud and Dropbox for syncing documents and other files I like to keep in sync between my devices, and install Cloud Station Drive on the Synology and my two computers.
With it installed and set up, it works just like Dropbox or iCloud and keeps everything inside a folder in sync identically between my machines, with a copy on the Synology. Because it’s on my home network, it’s blazing fast and consumes no internet bandwidth to do so. When I’m out I usually pause it, but I could have it sync remotely too if I wanted. It’s like having a private, personal cloud service.
And because of its speed in particular, I’m syncing my desktop, documents, downloads and a main Cloud Station folder for a bunch of other things— totaling over 200GB of stuff— with no fees whatsoever.
It’s been working really great, and has a built in backup versioning system in case I need it.
So now I’m completely using the Synology as my cloud storehouse, development centre, and soon will host local websites and databases on it to further leverage its capabilities.
Synology really has been building incredible backup & data protection network attached storage for quite some time— and I don’t think they’re given enough credit for the software that powers it, DiskStation Manager. It’s been rock solid and reliable for me, is packed with features, and is essentially a full-fledged home server if you want it to be, capable of a ton of different tasks and replacing the need for an additional computer to take on the role.
They’re very affordable too, with really great models in the $400-$800 price range— a bargain compared to a similarly spec’d computer outfitted to do the same jobs.
And living and working in technology from a rural area has revealed its strengths to me even more.
If you’re in the market for home or small office storage and server capabilities, give Synology a look.
Roz and I knew when we moved to Prince Edward County that internet access was going to be… different. Depending on the area where you live throughout the county (it can take an hour to traverse the whole island) there are different providers, speeds, and connection types. The internet landscape here is a patchwork just like its farmlands— and there’s no easy or simple guide to what services are offered where, and what’s best depending on where you are.
Determining what internet services were available to us initially meant asking our agent to inquire with the current owner about what they had. They stated they had Bell internet via DSL, and I saw the landline demarcation point on the house.
So when we moved in I immediately inquired with Bell… and after 15 minutes on the phone and the agent trying different variations of our address… they determined that services were not available to us!
This is pretty common in rural areas, where address databases (depending on where they’re from) may not be quite up to date. Having recently completed a website overhaul for a rural internet service provider (Execulink), I learned about how they qualify addresses, and things should improve in the next few years as service providers move away from Canada Post & Google databases and lookups, and over to modern geolocation-based qualification, which should greatly improve the accuracy of determining services available.
Couple out-of-date address database issues with the fact that Prince Edward County has become much more popular over the past 5-10 years, with land severances and new home builds— and it’s pretty likely that when you call a provider they may not be able to verify they can service you, or they may tell you they can offer X when actually both X and Y are available.
I had to call Bell back three times, trying different approaches with the sales agents to see if they could verify the service address. It wasn’t until I actually gave them the name of the former owner that they could bring it up! After that, they could book an installer, and two weeks after moving in and trying to get internet going— we had an install date (the next week!).
I also inquired with neighbours about what service they had. Some were using an over-the-air LTE Fixed Wireless service from Xplornet, others were using Bell DSL reseller service from providers like KOS & Teksavvy. Unfortunately there’s no cable available where we live, though Cogeco services homes just a couple of kilometres away.
Most Ontario rural communities currently have a patchwork of Fixed Wireless LTE, DSL, Cable, FTTN (fibre to the node), and (rarely) dedicated fibre service.**
Some folks choose to use a mobile LTE package from a cell carrier instead, however this solution is usually expensive, and often only decent if you’re getting great cell service (live near a tower) and don’t require much bandwidth (50 GB or less), which isn’t practical for anyone working in the digital economy.
When the installer was finished and we finally had internet going, it was a small triumph. But after a self-satisfied beer when they left, and a speedtest on a hot summer day in July… the reality— moving to a rural community and facing what was akin to going back in time 10-15 years in service quality— started to sink in.
10 down, 1 up. Yikes. We had just left 120 down, 15 up in the city, with Cogeco… and they were just rolling out fibre in our building. We could have had 1Gbps down and 150Mbps up!
Being that I work in web and software development, and my wife Roz works in video production, it wasn’t long before I started to investigate how to improve the situation without breaking the bank.
After a few weeks with the DSL service, it was clear that we were going to suffer big time without another solution. We have around 20 internet-connected devices— phones, laptops, computers, network attached storage, game consoles, smart TVs, a Nest thermostat, WiFi light switches… many of which don’t use lots of bandwidth at once but do each sip a little at a fairly constant rate. As a result, anything more than light use caused problems— stuttering streaming video, slow downloads, being kicked off game servers, broken video chats.
I began researching solutions, which essentially came down to these two options:

1. Add a second consumer internet connection and combine it with our existing one.
2. Have a dedicated commercial-grade connection installed.
Number 2 was difficult to research, and I couldn’t get through to any non-consumer-oriented office about it at Bell or a few other providers; it seems you need to pursue it via the commercial route. Based on my research, I estimated it would cost around $5-$12 thousand to have installed, with a monthly cost of around $500 thereafter… so, out of the question for now.
Number 1 seemed like a much more reasonable approach. At first I thought I might just end up with a second Bell connection, splitting some devices onto one network and the rest onto the other. After some more research I learned that Bell can bond two connections right in the modem, which explains why there are two ports on the back of most modem/routers they provide.
Through this research I learned of a device which I’d previously heard of but only thought existed to serve the commercial space (and this would be cost prohibitive to buy for my purposes): a Load Balancing Router.
Most of these devices support three or more internet connections, seamlessly connecting all of your devices behind them and balancing the load of traffic in and out across the connections. They don’t typically have WiFi built in, so you’ll need a WiFi router connected behind the load balancer for all of your wireless devices.
There are a variety of ways you can distribute traffic with a load balancer, from shaping it based on devices (MAC addresses), to type of service (e.g. mail vs. web browsing vs. file downloading), and more.
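As a toy illustration of the device-based approach: a balancer can hash each MAC address so that a given device consistently lands on the same WAN link. This is just a sketch of the idea, not any actual router’s implementation; the link names `wan0`/`wan1` and the MAC addresses are made up:

```shell
# Map a device's MAC address to one of two WAN links by hashing it.
# The hash is stable, so the same device always gets the same link,
# which keeps its sessions from bouncing between connections.
pick_wan() {
  mac="$1"
  # cksum produces a stable integer checksum; mod 2 selects the link
  n=$(printf '%s' "$mac" | cksum | cut -d' ' -f1)
  if [ $((n % 2)) -eq 0 ]; then echo wan0; else echo wan1; fi
}

pick_wan "a4:83:e7:12:34:56"
pick_wan "f0:18:98:ab:cd:ef"
```

Real load balancers layer smarter policies on top (per-service rules, failover when a link drops), but the core idea of deterministically pinning traffic to a link is the same.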
I was now excited that it might be possible to get decent internet that could meet our requirements.
So… great! I was going to get a Load Balancing Router, get a second internet connection, and then network the hell out of things to eke out performance wherever I could.
READ PART 2 HERE: Rural Internet Secrets: Part II
**If I have some spare time I would love to build a mapping page where county residents could enter what service they have from which providers, so you could better see the service networks available— and possibly use that to encourage providers to expand.