Most ISPs no longer charge a flat fee for large connections; instead they bill on a usage model, the most common being the "95th percentile" calculation.
* Periodic samples (usually every 5 minutes) are taken.
* At the end of the billing period (typically one month), the top 5% of samples are discarded, and the highest remaining sample becomes the billable rate.
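The two steps above can be sketched in a few lines of Python. The sample counts and rates below are illustrative assumptions, not figures from the article; the point is that spikes only affect the bill once they occupy more than 5% of the samples.

```python
def billable_rate(samples_mbps):
    """Apply the 95th-percentile rule: drop the top 5% of samples,
    and bill at the highest remaining sample."""
    ordered = sorted(samples_mbps)
    # Index of the 95th-percentile sample (the top 5% lie above it).
    cutoff = int(len(ordered) * 0.95)
    return ordered[cutoff - 1]

# A 30-day month sampled every 5 minutes yields 8640 samples.
# Baseline of 10 Mbps with 80 Mbps spikes (hypothetical numbers):
quiet = [10] * 8240 + [80] * 400   # spikes in ~4.6% of samples
busy  = [10] * 8140 + [80] * 500   # spikes in ~5.8% of samples

print(billable_rate(quiet))  # 10 -- spikes fall inside the dropped 5%
print(billable_rate(busy))   # 80 -- spikes survive the cut and set the bill
```

Note the cliff: the 400-spike month bills at the baseline rate, while the 500-spike month bills at the full spike rate, which is why "enough spikes" matters so much in what follows.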
With enough distributed computing clients, the aggregate increase in usage might be enough to raise the bandwidth charges.
Based on how distributed computing clients work, this seems a bit unlikely. However, each DC client was periodically grabbing large amounts of data from the master system; with enough systems, that could cause a large spike in bandwidth utilization. Enough spikes, and the 95th percentile kicks in, resulting in extra charges.
It's very possible that the University could blame David McOwen for a substantial amount of their monthly bandwidth charges. The fact that this occurred in December actually makes it worse for David, not better: with the campus largely idle over the break, baseline usage would be low, so the delta between non-DCC and DCC usage would be even greater.
Unfortunately, the article doesn't say how much total bandwidth the university has installed, which would let you calculate the maximum possible cost involved.