ApacheCon 2003

by Adam Trachtenberg

Related link: http://www.apachecon.com



I enjoyed my first time at ApacheCon. As usual, it's the people who make
the conference, even more than the location or the sessions.



Las Vegas can be fun, but it doesn't do much for me as a conference
destination. There's really no place to go unless you want to gamble or
shop. But gambling is no fun alone (nobody else was stupid enough to
waste their money playing blackjack besides me) and the stores are all
chains. That's one of the reasons I loved OSCon. Portland rocks!
Likewise, I didn't make it to php{con West earlier this year because I
had no interest in hanging out in Santa Clara.



I really did not attend many of the sessions on the first two days. It
was all "Java this" and "Java that" and I'm just not that interested in
Java. So, I hung out and talked shop with Chris and Nat and others.



The third day, however, was the PHP and Perl track. This not only
motivated me to listen to speakers, but it even had me up in time for an
8:30 talk: Geoff Young's "Writing Tests with Apache-Test."



I've recently taken a big interest in test-driven development. Geoff's a
mod_perl guy, and the mod_perl developers use Apache-Test extensively in
mod_perl 2.0. Apache-Test seems really cool because it lets you define
custom Apache httpd.conf settings for each test. Then, when you run your
tests, Apache-Test automatically starts up a new copy of Apache using
your exact settings, runs the test, and then shuts down Apache. And it
does this for each test, so there are no side effects among tests.
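
To give a flavor of what this looks like, here's a rough sketch (mine, not from Geoff's talk) of the kind of per-test configuration Apache-Test supports; the exact file name and handler module are my assumptions, though the t/conf/extra.conf.in convention and @ServerRoot@-style substitution come from the mod_perl testing docs:

```apache
# t/conf/extra.conf.in -- hypothetical extra settings that Apache-Test
# merges into the httpd.conf it generates before starting its own httpd.
# Tokens like @ServerRoot@ are substituted at configure time.
<Location /test-custom>
    SetHandler perl-script
    PerlResponseHandler TestCustom::Response
</Location>
```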



PHP has its own testing framework (and, unlike mod_perl, PHP works with
Web servers other than Apache). However, I'm interested to see if there's
some synergy between the two projects.



The next talk was also Geoff's: "Why mod_perl 2.0 Sucks!
Why mod_perl 2.0 Rocks!" Geoff developed this talk from his experiences
(i.e. frustrations) trying to port his mod_perl 1.3 application to
mod_perl 2.0. It turns out mod_perl 2.0 Sucks! because it's really hard
to transition from 1.0 to 2.0. On the other hand, it's much more
powerful than 1.0; therefore, mod_perl 2.0 Rocks! Or does it suck rocks?
Hard to say. (Sorry Geoff.)



I remember moving from PHP/FI to PHP 3 and then again from PHP 3 to PHP
4. I'm interested to see how the move from PHP 4 to PHP 5 goes. Maybe
I'll end up writing "Why PHP 5 Sucks! Why PHP 5 Rocks!" Who knows? (Or
maybe Chris will do it, since it was his idea first.)



Speaking of Chris Shiflett, his "PHP Attacks and Defense" talk was the
most interesting presentation on security that I've ever attended. He
walked through developing applications that minimize the risk of
Cross-Site Scripting (XSS) and Cross-Site Request Forgery (CSRF)
attacks. It's amazing to learn how even "secure" pages really aren't
secure at all.
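
The heart of the XSS defense comes down to escaping anything you echo back to the browser. Here's a minimal sketch (my own, not Chris's code) using PHP's built-in htmlspecialchars():

```php
<?php
// Never echo raw user input; escape it first so injected markup
// renders as inert text instead of executing as HTML or script.
function safe(string $input): string
{
    // ENT_QUOTES escapes single and double quotes in addition to <, >, &.
    return htmlspecialchars($input, ENT_QUOTES);
}

echo safe('<script>alert("xss")</script>');
// prints: &lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;
```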



There was one interesting moment when someone asked a question that
proposed "fixing" HTTP to prevent one type of attack, but before Chris
could answer someone jumped up and said "I wrote the HTTP specification
and it specifically says you're not supposed to use GET to do these
types of things." For a moment, I was a little worried for Chris, but he
handled it really well by shifting the blame away from HTTP and onto how
it's implemented by Web browsers.



I guess the "Localizing BBC News for a Global Audience" talk was good,
but I mostly sat in the back and surfed the Internet. Sorry. I was
hoping to hear about serving up multiple languages from the same code
base, but the talk turned out to be about writing an Apache module to
map URLs to different pieces of content.



Last, I heard "PHP 5 and Databases" by Marcus Börger. I'm already
playing around with SQLite and Iterators, so that part wasn't too new.
However, I picked up a little bit about PDO, yet another database
abstraction layer.



All in all, it was definitely worth it.



What did you like best about ApacheCon?


6 Comments

shiflett
2003-11-22 13:24:50
Day one wasn't all bad
You gave a good talk on the first day, so those of us who aren't the least bit interested in Java had something to do. Maybe those who are stuck in PHP's XML hell now have some hope for PHP 5. Help is on the way, and you did a good job of showing how much better life is going to be for those who use PHP for their XML needs.


That was Roy Fielding who stood up in my talk and commented on the HTTP protocol. I knew exactly what he was going to say, so luckily I wasn't surprised. It was actually a very relevant statement. For those who are interested, take a look at section 9.1.1 of RFC 2616 (http://www.ietf.org/rfc/rfc2616.txt). This section makes the following statement:


"In particular, the convention has been established that the GET and HEAD methods SHOULD NOT have the significance of taking an action other than retrieval. These methods ought to be considered "safe"."


So, a request for an embedded resource such as an image should not have the side-effects that I was describing. Of course, as Web developers, we have to concern ourselves with the practical implementations in addition to the specifications. In practice, GET requests are often unsafe, as was demonstrated in my talk.


I also enjoyed the conference very much. It's nice to hang around people who are smarter than me. :-)

bazzargh
2003-11-25 02:26:22
What was the problem?
"shifting the blame away from HTTP and onto how it's implemented by Web browsers"..."[GET] SHOULD NOT have the significance of taking an action other than retrieval."


I can't square these comments to figure out what the problem was. That statement in the spec is a requirement on the server side of HTTP. The client can't force the server to "do an action other than retrieval", even for PUT requests. Which makes it odd that the blame is being shifted to "how it's implemented by Web browsers".


Looking at the original bugtraq discussion of CSRF (http://archives.indenial.com/hypermail/bugtraq/2001/June2001/0209.html), they talk about browser-based fixes for what is a server-side problem. Although they do point out a nasty problem with browsers allowing JavaScript's form.submit() and button.click(), that's generating POST, not GET.


If anything, the bugtraq discussion put the blame on CGI toolkits (PHP takes a few knocks) for not forcing the user to distinguish between GET and POST.


So... what was the problem? (I'm just curious, not trying to score points against you two or PHP!)


Cheers,
Baz

trachtenberga
2003-11-25 07:40:57
What was the problem?
Chris will know better than I, but I believe the problem comes from allowing content to be updated from a GET request. (I guess I was incorrect when I said it was a web browser bug; this is more of a server issue. Sorry about that.)


GET should only retrieve; if you want to modify content, use POST. However, in practice, this is not how things are done, either in PHP or in any other language used to process a CGI request. Most developers don't distinguish between the two methods this way, and I'd guess many of them are unaware of this distinction.
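
To make that distinction concrete, here's one minimal way (a sketch of mine, not anything shown at the conference) to refuse state-changing work on a GET in PHP, using the standard $_SERVER['REQUEST_METHOD'] value:

```php
<?php
// Only allow updates over POST; GET requests should stay side-effect-free.
function allow_update(string $method): bool
{
    return strtoupper($method) === 'POST';
}

// In a real page you'd pass $_SERVER['REQUEST_METHOD'] here.
if (!allow_update('GET')) {
    // On a GET, don't modify anything; just display the form.
}
```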

bazzargh
2003-11-25 09:54:51
What was the problem?
Thanks for the reply.


"in practice, this is not how things are done"... Well, I know you're more a $language=~/^P/ guy, but java's servlet API does have separate doGet(), doPost(), etc, which was a step in the right direction.


However, I've seen developers avoid this by simply overriding service() (which normally calls doGet, doPost, doPut, and so on, based on the HTTP method). In a code review a couple of years back, I brought this up with a developer who always did it; a week later, all his code had been updated to have doGet and doPost simply call a new "doGetPost()" method instead... talk about missing the point!


I'm not lulled into thinking everything is sweetness and light in the land of Java, though. JSPs don't distinguish between GET and POST either, and the servlet API doesn't distinguish sufficiently between URL query parameters and POST data. (Because of this confusion, for example, Tomcat makes the error of decoding them both using the same character encoding, whereas in fact HTTP request URLs are always UTF-8 but the POST body encoding is specified in a separate HTTP header.)


A better mousetrap might be one that made sure GETs were side-effect-free, e.g. by building on top of a capability framework that didn't give GET request processors write access to anything but logs (see e.g. http://www.eros-os.org/essays/capintro.html). I'm not sure that writing code in such a B&D environment would be fun, though.

shiflett
2003-11-25 14:04:16
Re: What was the problem?

The basic problem being discussed at that very moment (see PHP Attacks and Defense for more information) was that requests for embedded resources, such as images, could have unintended side-effects. An attacker can carefully construct an image tag to cause an unsuspecting user to make a request of the attacker's choosing. This provides a convenient way to bypass identification mechanisms, firewalls, and the like.

Someone asked why the HTTP protocol didn't support a header that would allow the browser to indicate whether it intended to be requesting the parent resource or an embedded resource, as this could then be used to distinguish the two. Roy's argument was that the protocol already supports something to help prevent this. Because requests for embedded resources are always GET requests, the fact that GET should be considered safe should prevent this type of attack already. Of course, PHP lets you determine the request method easily enough, but there are a lot of people who write applications that do not adhere to this, and I'm sure I've been guilty of it from time to time. In addition, if you have register_globals enabled and don't specifically check the request method, your programming logic might not care what the request method is.

So, yes, the problem isn't in browsers or even Web servers; it's the developers. Roy knows this very well, I'm sure. He only commented in order to address the question being asked that proposed fixing the protocol that he helped write.

shiflett
2003-11-25 14:15:56
What was the problem?
It has absolutely nothing to do with the server-side language being used. Even when we were all writing CGIs in C (before we used Perl for this task), we could still perform "unsafe" actions as a result of a GET request.