For numerous reasons, no.

Why is explained in this MSDN post:

First, from a performance perspective, the pointers get larger, so data structures get larger, while the processor cache stays the same size. That basically results in a raw speed hit (your mileage may vary), so you start in a hole and have to dig yourself out of it by using the extra memory above 4 GB to your advantage. In Visual Studio this can happen in some large solutions, but I think a preferable thing to do is simply to use less memory in the first place, and many of VS's algorithms are amenable to that. Here's an old article that discusses the performance issues at some length:
https://learn.microsoft.com/archive/blogs/joshwil/should-i-choose-to-take-advantage-of-64-bit
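
To make the cache-pressure point concrete, here is a minimal sketch; the `SymbolNode` struct is a made-up example of a pointer-heavy node (not actual Visual Studio code). Building it for x86 versus x64 shows the same data growing from roughly 20 to 40 bytes simply because the pointers doubled in width:

```cpp
#include <cstdio>

// A hypothetical node of the pointer-heavy kind that fills an IDE's
// in-memory model: mostly pointers, a little payload.
struct SymbolNode {
    SymbolNode* parent;
    SymbolNode* firstChild;
    SymbolNode* nextSibling;
    const char* name;
    int         flags;
};

int main() {
    // On a 32-bit build each pointer is 4 bytes, so the node is about 20 bytes;
    // on a 64-bit build each pointer is 8 bytes and padding rounds the node up
    // to 40 bytes, so roughly half as many nodes fit in each cache line.
    std::printf("sizeof(void*)      = %zu\n", sizeof(void*));
    std::printf("sizeof(SymbolNode) = %zu\n", sizeof(SymbolNode));
    return 0;
}
```
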
Secondly, from a cost perspective, probably the shortest path to porting Visual Studio to 64-bit is to port most of it to managed code incrementally and then port the rest. The cost of a full port of that much native code would be quite high, and of course all known extensions would break, so we'd basically have to create a 64-bit ecosystem, pretty much like you do for drivers. Ouch.
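
To illustrate why the extensions break: a 64-bit host process cannot load 32-bit native DLLs at all, so every existing native extension would have to be recompiled and reshipped. A minimal sketch of what that failure looks like, assuming a hypothetical extension DLL name:

```cpp
#include <windows.h>
#include <cstdio>

int main() {
    // From a 64-bit process, loading an existing 32-bit extension DLL fails
    // outright: the loader rejects the mismatched architecture.
    // "LegacyExtension.dll" is a hypothetical name used for illustration.
    HMODULE h = ::LoadLibraryW(L"LegacyExtension.dll");
    if (h == nullptr) {
        // Typically ERROR_BAD_EXE_FORMAT (193): "%1 is not a valid Win32 application."
        std::printf("LoadLibrary failed, error %lu\n", ::GetLastError());
        return 1;
    }
    ::FreeLibrary(h);
    return 0;
}
```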