Tag Archives: Work

Protecting the Open Internet

Recently I was asked by my employer, Ensighten, to draft a blog post about net neutrality.  They used the draft to guide their post on the issue.  With their permission, I’m posting the original here as well:

On July 14th, Ensighten submitted a public comment on FCC Proceeding 14-28, “Protecting and Promoting the Open Internet,” in favor of net neutrality. The FCC’s current proposal allows for “fast lanes”: content providers would purchase priority access to the network and would be able to deliver content faster than services that haven’t paid for that priority. In effect, all other traffic ends up in a “slow” lane. This approach raises questions about how fast and slow lanes would be enforced, and it stifles innovation by letting established properties buy their way into faster access.

In the FCC’s model, each home Internet service provider could offer fast-lane service, but to give all users a consistent experience, a content provider would need to pay a fee to each of these ISPs. In addition to the initial outlay, this adds ongoing overhead to maintain contracts with each provider for access. Some services may be able to subsist in the slower lanes, but even a second of extra load time can reduce customer conversions, meaning many businesses would be forced to purchase this priority service.

Under a true net neutrality model, all information would be treated equally on the wire. The ability to serve content wouldn’t be affected by others competing for the same users. In this model the Internet behaves more like a phone service, where calls aren’t prioritized, but rather handled as they come in. The ability to serve content is still limited to the speed of the service purchased, but cannot be delayed in-transit to the user.

For Ensighten, the net neutrality model allows us to deliver analytics tags to our customers’ pages quickly and consistently.  Knowing that our data can’t be slowed in transit allows us to state with confidence that we can provide the fastest Tag Delivery Network for our users.  In turn, our customers benefit by knowing our tag delivery will always be as fast as possible, leading to consistent load times and page performance.

The other benefits of net neutrality apply equally to Ensighten and our customers. Both save the money and manpower otherwise spent creating agreements with any number of ISPs. Our customers and their end users will see consistent quality of access to the services they use.  Based on Ensighten’s comment and those of the million-plus businesses and users who rely on an open Internet, we hope the FCC will see this issue the same way.

Previous Work: CSR Interface and Dashboard for MetroLINK (2012)

Static demo (w/ dummy data)

Source code (zip)

Before I begin I want to say that the code here, along with any other code I have posted, is posted with the express permission of the company that originally had me write it.

This is something I’ve been anxious to write about: I’m really happy with the results, and it helped out quite a bit at MetroLINK.  It’s a web dashboard that shows call data from Metro’s Customer Service Representatives (CSRs).  It pulls data from an Access database through ODBC and uses PHP to generate the main dashboard as well as the drill-down pages.  The graphs are provided by the excellent Flot JavaScript library.

Last year it was decided that we needed to gather more information about how many calls the CSRs were taking, and what those calls were about.  Our phone system didn’t support anything beyond basic logging, so until it could be upgraded, something needed to be put in place that would let the CSRs track their own calls.  I opted for Access because it was a database system others were already familiar with, and I could build an interface easily enough with VBA and Access’ own forms.  We saw results almost immediately and gained much better insight into what the CSRs were doing.

Access’ built-in reporting functionality was great, but it was missing the “live” element.  That’s when I decided to start working on this in my spare time.  I discussed with my co-workers what we would need on a dashboard, and then set out to make it happen.

I had some hesitation when figuring out how to get the data from the Access file into PHP.  The same file was being used by the CSRs to input this data, so I was worried about the file being locked.  The Access forms were already split to avoid this, but I didn’t know how a connection from PHP would behave.  With ODBC, setting up the connection to the Access file was a breeze, and I was pleased to find it handled multiple connections without issue.  On top of that, I could specify that the connection was read-only, providing some security as well.

When I was designing the dashboard I wanted it to have a similar appearance to the public-facing website gogreenmetro.com, so I borrowed the color scheme and title image.  While the data only changed on each refresh (more on that later), I wanted the dashboard to appear to have activity in it.  To that end I included several hover-over effects and made things respond in useful ways where I could, primarily in the graphs and tables, where you can highlight parts and get specific information about a point or pie slice.  While it isn’t perfect, it gives the dashboard a little more polish and makes it feel more “alive” than the same page without those elements.
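As a rough sketch of how those hover effects hang together (the series label, data, and helper name here are hypothetical, not from the original code): Flot’s plothover event hands a listener the nearest data point, and a small helper turns it into tooltip text:

```javascript
// Hypothetical helper: given the item object Flot passes to a
// "plothover" listener, build the tooltip label text.
// item.series.label and item.datapoint are standard Flot fields.
function hoverLabel(item) {
  const calls = item.datapoint[1];
  return item.series.label + ": " + calls + (calls === 1 ? " call" : " calls");
}

// On the live page this would be wired up roughly like:
//   $("#placeholder").bind("plothover", function (event, pos, item) {
//     if (item) { showTooltip(pos.pageX, pos.pageY, hoverLabel(item)); }
//   });

hoverLabel({ series: { label: "Fare questions" }, datapoint: [0, 12] });
// "Fare questions: 12 calls"
```

The helper is pure, so the same label logic can serve both the graphs and the table highlights.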

After the main dashboard was completed I started working on the drill-down pages.  Both can be accessed from the main page by clicking the totals for number of calls and number of unresolved calls.  The unresolved drill-down is just a larger version of the breakdown by CSR, which is simply a matter of building a table.  But the number-of-calls drill-down introduced some challenges.

On the main page I used Access’ Hour function to group calls by hour and sent that to Flot.  It was simple, and it worked for the purposes of that graph.  For the more advanced graphs, though, that method was no longer going to work.  I had to use Flot’s time support, which meant I needed milliseconds since the Unix epoch, as that’s JavaScript’s native time format.  None of this was too challenging until time zones entered the picture.  Using DateDiff to get seconds since the epoch gave me a sort of “false UTC” that treats the times as if there were no time zone offset.  Since the data would always be correct in the actual database and the presentation wasn’t affected, I saw no problem with this; the Flot API documentation actually encourages the approach.
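The “false UTC” idea is easy to sketch in JavaScript terms (the function name and sample data are mine, not from the original code): treat the local wall-clock time as if it were UTC, so every machine computes the same millisecond value regardless of its own time zone:

```javascript
// Build a "false UTC" timestamp for Flot: take the wall-clock
// components straight from the database and feed them to Date.UTC,
// which returns milliseconds since the Unix epoch with no local
// time zone offset applied.
function toFlotTimestamp(year, month, day, hour) {
  return Date.UTC(year, month - 1, day, hour); // JS months are 0-based
}

// A bucket of 7 calls logged at 3pm on 2012-01-01 becomes one
// Flot data point of the form [x: time in ms, y: call count]:
const point = [toFlotTimestamp(2012, 1, 1, 15), 7];
```

This is the same value DateDiff against the epoch (times 1000) produces on the PHP side; the point is that no offset is baked in anywhere.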

Until I checked the tooltips.  JavaScript corrects for time zone once you start using its date functions, so all my times were coming in a few hours off.  PHP provides an easy way to get the local time zone offset in seconds, so I used that to correct the difference by adjusting the timestamps before the page was rendered.  A side effect is that the times change depending on where the page is viewed, so 3pm Central would show as 1pm Pacific, and so on.  In this context that would probably be a bug, but in other contexts it would be a feature.
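Another way to keep the tooltips honest, and the one Flot’s time-series documentation suggests for this convention, is to format the “false UTC” timestamps with JavaScript’s getUTC* accessors, which never apply the viewer’s time zone.  A sketch (the function name and formatting style are mine):

```javascript
// Format a "false UTC" Flot timestamp for a tooltip without letting
// the browser apply the viewer's time zone.  Because the timestamp
// was built as if the local wall-clock time were UTC, the getUTC*
// accessors recover the original clock time on any machine.
function tooltipTime(ms) {
  const d = new Date(ms);
  const hour12 = d.getUTCHours() % 12 || 12;          // 0 -> 12
  const suffix = d.getUTCHours() < 12 ? "am" : "pm";
  const minutes = String(d.getUTCMinutes()).padStart(2, "0");
  return hour12 + ":" + minutes + suffix;
}

// 3pm stays 3pm whether the page is viewed in Central or Pacific time:
tooltipTime(1325430000000); // "3:00pm"
```

With this approach no server-side offset correction is needed, and the displayed time no longer shifts with the viewer’s location.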

In all, this project taught me a lot.  It reinforced my knowledge of things like JSON, HTML/CSS, and implementing designs that work cross-browser.  It gave me a chance to use PHP for a full project, learning it as I went.  Finally, it let me really dig into Flot and jQuery.  Bringing all these things together in one consistent project was a great experience.