More specifically about OS X: support for 32-bit hardware was dropped in 2011, but support for 64-bit applications has been available on all 64-bit hardware since 2007, even if you were running the 32-bit OS kernel. Apple handled that transition way better than Microsoft.
To be fair, Microsoft's userbase requires backwards compatibility a lot more than Apple's does, since Macs are largely consumer systems rather than enterprise ones.
Microsoft got flak for dropping 16-bit app support in 64-bit OSes (since 64-bit Windows doesn't have NTVDM). They'd be flayed alive if they dropped 32-bit support.
32 vs 64-bit app support is quite separate from 32 vs 64-bit kernel-space drivers. 32-bit operating systems need 32-bit drivers and 64-bit operating systems need 64-bit drivers. 64-bit operating systems can run 64-bit apps and 32-bit apps.
Crucially, however, 32-bit Mac OS X 10.5 could run 64-bit apps just fine. That, combined with OS X's universal binary support, made it easy to start deploying 64-bit application code far earlier than most Windows apps began transitioning.
Most of Apple's computer models were only available with 32-bit x86 for less than a year, so the installed base that needed their applications to still be 32-bit was minuscule compared to the installed base of 32-bit Windows editions that couldn't run 64-bit apps even if the CPU was 64-bit capable. Most Windows application developers faced bigger challenges in deploying 64-bit code than Mac app developers did, and the Windows devs' work would benefit a much smaller fraction of their userbase. Windows app developers don't switch to 64-bit unless they absolutely have to. (e.g. this year's 64-bit re-release of Skyrim, to accommodate the address space requirements of heavy modding. Skyrim was originally released in late 2011 as a 32-bit-only game despite listing CPU requirements that couldn't be satisfied by any 32-bit CPU.)
It does help some, but probably the biggest advantage is that they simply didn't have to support x86 machines from before 2006. By that point, ACPI, AHCI, EHCI, OHCI, HDA drivers would get you a working system, except for 3D graphics and networking. That's why Hackintosh systems are viable: basic PC hardware is extremely standardized now compared to what it was in the 1990s.
Apple's hardware platform control helped them cut down on the number of network drivers they needed to implement, and their decision to do the 3D graphics drivers in-house didn't save them much effort but made things a lot more straightforward.
By contrast, Windows 7 officially supports GPUs from 2001 running on systems that could be quite a bit older, from the days when there were half a dozen third-party chipset vendors that couldn't properly implement a spec even when they were trying to make standard interfaces. Windows 10 moves the cutoff forward quite a bit with its NX bit requirement, but still targets a diversity of core system components that simply didn't exist by the time Apple entered the x86 market.
Microsoft and Apple both have to contend with pretty much the same driver challenges for peripherals like printers, although it tends to be easier to put those drivers in userspace where 32-bit vs 64-bit isn't a showstopper.