If you consume the classes directly in the application, there is no way to avoid having the data/business assemblies living alongside the application assemblies. As such, there is no way to prevent a developer from referencing those other binaries and consuming classes from them.
It really comes down to developer practice. You have to establish the culture of the development architecture with the developers. They must know to build UI code only against the BLL libraries. Yes, the binaries for the DAL are there, they have to be, and you can't programmatically hide them. The developers just have to know not to consume them directly.
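As a minimal sketch of that convention (all class names here are hypothetical, and this uses Java in place of whatever .NET language you're actually working in): the DAL type is present in the output folder, but by agreement the UI only ever touches the BLL type.

```java
// DAL: ships alongside the app, but UI code should never reference it (by convention only).
class CustomerRepository {
    String fetchName(int id) { return "Customer " + id; }
}

// BLL: the only layer the UI is allowed to call.
class CustomerService {
    private final CustomerRepository repo = new CustomerRepository();

    String displayName(int id) {
        // Business rule lives here, not in the UI.
        return repo.fetchName(id).toUpperCase();
    }
}

public class Ui {
    public static void main(String[] args) {
        // The UI consumes CustomerService only; CustomerRepository is off limits
        // even though nothing in the compiler stops you from using it.
        System.out.println(new CustomerService().displayName(7));
    }
}
```

Nothing enforces the boundary here; that's exactly the point of the paragraph above. The discipline is cultural, not technical.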
Now, all this isn't to say you cannot architect a system that hides your layers. If you architect your system to be service-oriented, you can isolate the layers. For example, you could build a business layer application. This could be a Windows service or web service that exposes the business model through service interfaces (remoting, web services, etc.). The UI applications talk to the business layer through those services instead of directly consuming the business classes. The same can be done with the database layer.
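A rough sketch of that isolation, again with hypothetical names and Java standing in for a .NET language: the UI is shipped only the service contract, while the implementation lives in the business-layer process and is never referenced by the client.

```java
// The contract is the only type the UI assemblies ever see.
interface OrderService {
    int quoteCents(int quantity);
}

// Lives in the business-layer process (Windows service, web service, etc.);
// the UI never links against this class or the DAL types behind it.
class OrderServiceImpl implements OrderService {
    private static final int UNIT_PRICE_CENTS = 250; // assumed price, illustration only

    public int quoteCents(int quantity) {
        return quantity * UNIT_PRICE_CENTS;
    }
}

public class Client {
    public static void main(String[] args) {
        // In a real deployment this reference would be a remoting or
        // web-service proxy, not a direct instantiation in-process.
        OrderService svc = new OrderServiceImpl();
        System.out.println(svc.quoteCents(3));
    }
}
```

Because the client only compiles against the interface, there is simply nothing in its output folder for a developer to misuse; the hiding becomes structural rather than a matter of discipline.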
Designs like this are how large enterprise systems like SAP are built. They have a database server, a database application, a business application and the client applications (and certainly additional levels in between). These are all different program processes. The client apps do not connect directly to the database server. Instead they connect to the application servers through remote procedure calls (like remoting). (In reality the client app is really a thin client or dumb terminal. All the work of the application is actually done by the application server.)
Making the jump to this type of architecture is very difficult. It requires a very solid architecture and disciplined logging and error management to keep debugging across process boundaries manageable. For lightweight applications it's far too complex and just plain overkill. It is a very cool concept, though.