That's the way client-server systems work - data is only pulled to the client when the client requests it; there is no way for the server to originate a data transfer. By moving your Access database to a separate machine and letting multiple users access it from their own PCs, you have turned your system into a client-server model (that's not a criticism - there's no other way of doing what you want to do). So the basic answer to your question is "no".
How critical is the requirement to keep the on-screen data up to date? You could easily include a command button that would initiate a refresh whenever the user wants an update - this would mean that data only flows over the network when it is needed.
Bigger client-server systems (i.e. not Access!) are often written with "optimistic locking", which resolves multi-user conflicts (where one user is trying to modify a record on their PC that has already been updated by another user). This is usually done by adding a date/time field to each table, updated every time a record is modified. That value is passed to the front end along with the rest of the record and returned to the server with the update, so any update carrying an old timestamp is rejected (because someone else has modified the record in the interim). Hope that all makes sense - I've covered quite a lot of ground in a short paragraph!
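To make that paragraph a bit more concrete, here's a minimal sketch of timestamp-based optimistic locking in Python, using an in-memory SQLite database as a stand-in for the real back end (the table and column names are purely illustrative, not anything from your database):

```python
import sqlite3

# Illustrative schema: each record carries a last_modified timestamp.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE patients (id INTEGER PRIMARY KEY, name TEXT, last_modified TEXT)"
)
conn.execute("INSERT INTO patients VALUES (1, 'Smith', '2024-01-01 09:00:00')")

def update_patient(conn, patient_id, new_name, timestamp_seen):
    """Apply the update only if the record still carries the timestamp the
    client originally read; otherwise someone else got there first."""
    cur = conn.execute(
        "UPDATE patients SET name = ?, last_modified = datetime('now') "
        "WHERE id = ? AND last_modified = ?",
        (new_name, patient_id, timestamp_seen),
    )
    return cur.rowcount == 1  # False means the update was rejected

# First client read the record at 09:00 and its update succeeds...
assert update_patient(conn, 1, "Smythe", "2024-01-01 09:00:00")
# ...so a second client still holding the old timestamp is rejected.
assert not update_patient(conn, 1, "Smith-Jones", "2024-01-01 09:00:00")
```

The key point is that the WHERE clause compares the timestamp the client saw against the current one, so stale updates simply match zero rows.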
If it is very important that the on-screen data is current (which makes sense for a medical database!), then you will have to adopt a strategy similar to the one you describe, regardless of the network traffic problems. I've not tried to do this, so I can't say whether 12 users would be unworkable - my recommendation would be to suck it and see. You could look at generating some kind of unique value for each PC, and basing the re-query time on that, so they don't all re-query at the same time (are your PC clocks kept synchronised to a standard time by a central server? Many organisations do this nowadays).
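The "unique value per PC" idea above can be sketched like this - deriving a stable per-machine offset from the hostname so the twelve PCs spread their re-queries out rather than hammering the server in unison (the 60-second base interval and 20-second jitter range are my assumptions, not anything from your setup):

```python
import hashlib
import socket

def refresh_interval(base_seconds=60, jitter_seconds=20):
    """Return the base refresh interval plus a per-machine offset derived
    from the hostname, so each PC settles into its own re-query rhythm."""
    digest = hashlib.md5(socket.gethostname().encode()).digest()
    offset = digest[0] % jitter_seconds  # stable offset in 0..jitter_seconds-1
    return base_seconds + offset
```

Because the offset comes from a hash of the machine name, each PC always gets the same interval, but different PCs are very unlikely to share one.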
Hope that helps...