
Best practices for remote management of medical imaging devices

Securing externally facing websites, especially those that include sensitive medical information, can be challenging. Learn more about best practices for remote management in this identity and access management expert response.

In my organization, we have some medical imaging devices on our network. The vendors require Web access to their devices for remote management. What is the best way to do this without compromising security?
This is a textbook case of how best to secure an externally facing website. The same rules apply whether the site is for customer use or vendor access, since both involve outsiders coming to the website. In this case, it's a special website for vendors only, so customers -- or anyone other than the vendor -- shouldn't be poking around on the site.

If the site were inside a corporate network, with access only for employees, the solution would be easier. The site would be segregated on its own Web server with limited access for selected employees based on their job roles, and would have no access to the Internet or any other external network.

But that's not the case here. In addition, it sounds like the vendor may have its own Web-based application installed on the network for accessing the devices. That adds another security twist, since hosting someone else's Web software on a network increases the risk of malicious access.

Here are some suggestions for protecting both the security of a corporate network and Web infrastructure in this situation.

If there is a box available, the website should run on its own dedicated Web server, regardless of whether the medical devices are accessed through an organization's own homegrown Web application or through Web software provided by the vendor. The website should also have no access to anything on the network, such as databases or other internal back-end systems, other than the medical devices themselves. And of course, it should be planted firmly in the company's DMZ, not anywhere on the internal network.

Next, make sure it's password protected. Don't use basic or digest authentication that comes packaged with many Web servers. Use a Web-based login page. If the application doesn't have one, have corporate developers build one, even if just for this application. It doesn't take much, just a simple HTML or JSP page as a front-end to the application where users will have to enter their IDs and passwords.
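As a sketch of what sits behind such a login page, the Python fragment below shows one common way to store and verify passwords as salted hashes rather than clear text. The function names, salt size and iteration count are illustrative assumptions, not part of any particular vendor application.

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes) -> bytes:
    # PBKDF2 with SHA-256; the iteration count is an assumed baseline.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def make_user_record(password: str) -> dict:
    # Each user gets a fresh random salt, so identical passwords
    # still produce different stored hashes.
    salt = os.urandom(16)
    return {"salt": salt, "hash": hash_password(password, salt)}

def verify_login(record: dict, attempt: str) -> bool:
    candidate = hash_password(attempt, record["salt"])
    # compare_digest avoids leaking timing information.
    return hmac.compare_digest(candidate, record["hash"])

record = make_user_record("s3cret-vendor-pw")
print(verify_login(record, "s3cret-vendor-pw"))  # True
print(verify_login(record, "wrong-password"))    # False
```

The point is that the front-end page only collects the ID and password; the server side should never store or compare passwords in the clear.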

Each user at the vendor should have a unique user ID and password. Get a complete list of users and regularly audit it to prune out stale accounts. All users should have the same limited access -- to the website only -- based on the principle of least privilege. If there is ever a breach associated with the site, it can be tracked back to an individual user. Shared user IDs would make it impossible to identify a culprit, or track down the source of the malicious access.
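The regular audit of the user list can be as simple as flagging accounts that haven't logged in recently. Here's a minimal Python sketch; the 90-day cutoff, account fields and user names are assumptions for illustration only.

```python
from datetime import date, timedelta

# Placeholder vendor user list; in practice this would come from the
# authentication store behind the website.
users = [
    {"id": "vendor.alice", "last_login": date(2008, 6, 30)},
    {"id": "vendor.bob",   "last_login": date(2008, 1, 5)},
    {"id": "vendor.carol", "last_login": date(2008, 7, 10)},
]

def stale_accounts(users, today, max_idle_days=90):
    # Any account idle longer than max_idle_days is a pruning candidate.
    cutoff = today - timedelta(days=max_idle_days)
    return [u["id"] for u in users if u["last_login"] < cutoff]

print(stale_accounts(users, today=date(2008, 7, 15)))  # ['vendor.bob']
```

A stale account that's never pruned is exactly the kind of shared or forgotten credential that makes a breach impossible to trace.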

Make sure the website uses only SSL to protect the confidentiality of the data being transmitted. SSL should start at the login page to also protect user IDs and passwords from being sniffed en route from the vendor.
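On Apache, for example, one common way to enforce this is to redirect every plain-HTTP request to HTTPS in the port-80 virtual host, so the login page is only ever served over SSL. This fragment is a sketch; the hostname is a placeholder, and the SSL virtual host itself would be configured separately.

```apache
<VirtualHost *:80>
    ServerName vendor-portal.example.com
    # Send all plain-HTTP requests to the SSL side of the site,
    # so credentials are never submitted over an unencrypted link.
    Redirect permanent / https://vendor-portal.example.com/
</VirtualHost>
```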

Restrict access to the website by IP address filtering so users can only log on from an approved white list of IP addresses at the vendor. This will block access from someone other than the vendor, and will also prevent a rogue employee at the vendor from trying to access the site from another location. Remember though that IP address filtering is a weak control since IP addresses can be easily spoofed, and should be combined with other Web controls. It shouldn't be used as a standalone control, or in lieu of stronger controls such as user IDs and passwords.
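A white list check like this takes only a few lines; the sketch below uses Python's standard ipaddress module, and the vendor address ranges are placeholders, not real allocations.

```python
import ipaddress

# Assumed vendor address ranges; replace with the addresses the
# vendor actually connects from.
VENDOR_WHITELIST = [
    ipaddress.ip_network("203.0.113.0/28"),
    ipaddress.ip_network("198.51.100.25/32"),
]

def is_allowed(client_ip: str) -> bool:
    # True only if the client address falls inside an approved range.
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in VENDOR_WHITELIST)

print(is_allowed("203.0.113.7"))  # True
print(is_allowed("192.0.2.99"))   # False
```

In practice this filtering usually belongs on the firewall or Web server rather than in application code, but the logic is the same: deny by default, allow by exception.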

Lastly, conduct regular scans of the website with tools like AppScan and WebInspect to check for vulnerabilities. Even if access to the site is restricted only to the vendor, it's still on the Web and exposed to malicious users who might try to exploit it. Use a robots.txt file to ask spiders and crawlers, like those used by Google, not to index the site, so it doesn't show up in search engines.
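A robots.txt that excludes everything is a two-line file placed at the site root. Keep in mind it's a request that only well-behaved crawlers honor, not an access control.

```
# Placed at https://<site>/robots.txt; asks all compliant crawlers
# to skip the entire site.
User-agent: *
Disallow: /
```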

This is just a brief rundown of basic Web server security. Consult the documentation for your particular Web server hosting the site, such as Microsoft IIS or Apache, for other specific suggestions for locking down servers.


This was last published in July 2008
