I learned assembly language back in the 8-bit microcomputer era. I started on a Commodore 64 learning the 6502 processor and eventually graduated to an IBM PC learning 80x86 assembly. I retain a significant amount of information from this era, and it has come in handy a few times during my career when looking at mixed source/assembly. While at Microsoft (yes, I spent time there) I had occasion to identify a bug in the Visual Studio debugger which would not have been possible without my previous experience. Breakpoints are set on the Intel 80x86 through an INT 3 software interrupt: the debugger replaces a byte in the code segment with the INT 3 opcode, and when it ultimately handles the breakpoint it needs to put back the value it overwrote. I had some C++ template code where clearly this was not happening.
Nevertheless, assembly language knowledge has become extremely niche. Unless you are working on embedded devices it is mostly irrelevant. And with virtual machines (Java's JVM and Microsoft's CLR) gaining acceptance thanks to the high computing power now available, people are that much further removed from having visibility into this layer of computing.
I could say, "Yeah, you should learn hardware digital design and Karnaugh maps since it will improve your understanding." And yes, I suppose learning hardware design would give you even MORE insight into computing, but assembly language is starting to go in the same direction - diminishing returns.
Hey, I'm all for expanding my horizons through learning, but in this time of dozens upon dozens of APIs that software people have to worry about, specialization is key, and assembly language has become mostly irrelevant given how low a priority it is for getting the job done.
Sure, there are certain instances where it can be handy, but those times have been very few and very far between.