Strategic Informatics

A blog about the strategic application of technology


Adding MX records to AWS Lightsail for GMail hosted accounts

While migrating several domains to AWS via Route 53 and setting up a few application instances on AWS Lightsail, I discovered that, to keep things turnkey, AWS provisions a separate set of name servers for Lightsail DNS zones. Your domain's name server entries therefore need to point to the Lightsail name servers, not the Route 53 ones.

My particular Lightsail-hosted instance had its mail hosted by Google, so here is a quick rundown of how your Lightsail MX records should look when pointing to Google.

Remember that in order for Lightsail to use these records, your domain needs to be pointed at the Lightsail name servers, which can be found by logging into the AWS Lightsail console and clicking Networking. There you will find the DNS zones for your hosted application instances. Select your specific domain from the list.
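You can check which name servers your domain is currently delegated to from any terminal; example.com below is a hypothetical placeholder for your own domain:

dig NS example.com +short   # replace example.com with your domain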

Then you will see a “+ Add record” option, which is where you add the MX record entries that route your mail. Note that an MX record maps to a domain name, not an IP address.
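As an illustration, here is the standard set of MX records Google has published for its hosted mail for years; Google has revised these over time, so treat them as a reference rather than gospel. Each one goes in as a separate MX record, with the priority shown and the Google host as the value:

1    ASPMX.L.GOOGLE.COM
5    ALT1.ASPMX.L.GOOGLE.COM
5    ALT2.ASPMX.L.GOOGLE.COM
10   ALT3.ASPMX.L.GOOGLE.COM
10   ALT4.ASPMX.L.GOOGLE.COM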

Note: You should confirm your specific MX record settings in your Google admin console.


Lessons Learned from Implementing a Service Oriented Architecture in Healthcare

I thought I would take the time to post some lessons learned over the past three years leading a ground-up initiative at WellMed Medical Management to create a service oriented enterprise application for physicians and medical management staff to treat and transition care for patients.  Service Oriented Architecture (SOA) is an often misunderstood concept, even among the ranks of IT professionals.  While many SOA initiatives have manifested as internal business logic exposed in the form of web services, the approach behind SOA is actually very non-technical: it is rooted in a deep understanding of the business and its strategic goals, and it involves an ever evolving process of continuous improvement and refinement.  This makes the approach both strategic and operationally focused.  In this post I will outline what I feel are some critical components necessary for a successful SOA-based project.

 

Understand the Business Domain Model

Many companies evolve over time, and it is important to ensure your view of the business domain evolves with them.  Healthcare organizations are no different.  Communication is key, especially in rapidly changing environments such as healthcare, and defining a clear functional domain model will help pave the way for future development or shifts in the business.  Find natural lines of separation within the business and look for natural service boundaries that allow you to build services that make sense.  By establishing clear service boundaries you help define authoritative sources for data and information and avoid duplication within your architecture.

Lines of separation can be found between:

  • Different lines of business (e.g., Medical Management, Clinical, Research, Transportation, MSO, HMO, etc.)
  • Product & Service Lines (e.g., Transportation, Disease Management, Chronic and Complex programs)
  • Product Variations

Understanding the business is critical to ensuring that IT is aligned adequately to support business functions.  IT is often a common service that crosses multiple business functions.  Common services that cross boundaries include infrastructure, data warehousing, applications, development, and so on.  Through this modeling effort many IT organizations can quickly identify gaps in existing services and support for the business community.

 

Maintain a Common Definition of Terms

In complicated environments like healthcare, where efficiently managing risk for medically managed patients and members is critical to the success of your business, you must take a solid look at the data flowing into and out of your organization.  Having a solid definition of what a bed-day from a hospital visit is, as an example, is important from both a financial perspective and a medical management perspective.  Stratifying patients based on acute events such as historic hospitalizations, lab results, HRAs, or audits is equally important for the management of that patient's care.  Both the finance and medical management sides of the business are critical to the patient's care, yet they may view and assess the same information slightly differently unless a common method for defining the data and terms used to manage your business is clearly established.

 

To effectively define these terms you must:

  • Build from business domain knowledge
  • Evangelize the terms and their correct usage
  • Introduce new terms slowly
  • Seek definitions that are both
    • Unambiguous
    • Context insensitive

 

Maintain a Consistent View of the Ideal System

 

I’ve always been an advocate of having a strategic, long-term vision for a product.  Knowing where you are going will help you make the day-to-day decisions that ultimately get you where you need to be.  The ideal system will often seem unrealistic to many, but setting the bar higher than others will make your organization more agile and ready to change as market or other external factors shape your business.  Our approach to building a strategic enterprise application included components for mobile devices (pre-iPad) and both patient and provider portals.  These will not always manifest themselves as products immediately, but laying the groundwork and service breakdown will allow you to transition readily to other products or services.

 

The following items will help you build a consistent view of your ideal system:

  • High level design or model of the:
    • Goal system and
    • Intermediate steps
  • Consider all relevant aspects
    • Hardware/Networking
    • Services/Communication Protocols
    • Data/Access
  • Keep it in a maintainable form
  • Evangelize the roadmap

 

I’ve found that keeping a current model of the ideal system helps in many other ways as well, such as quickly describing the business to potential employees, vendors, partners, and even internal staff.

 

Seek Opportunities to Advance the System

 

Service Oriented Architecture is a concept and proposition you must be dedicated to, not a passing trend you can close out as part of a project.  As such you must change your mindset and approach to all of your projects.  SOA initiatives are often grounded in major strategic initiatives, and like any major IT initiative they should fundamentally support the business's core objectives and strategic goals.  Your day-to-day choices should seek to advance this strategic effort and build on the shoulders of what has already been created.  A core tenet of SOA is re-usability.  When building new services or implementing new features you should always seek opportunities to advance the system as a whole.  Ways to do this include:

  • Avoid big-bang architectural changes
  • Implement the final system in small steps
  • Places to look for strategic opportunities include
    • New lines of business
    • New clients or partners
    • 3rd party software updates
    • New vendor software that complements your core products (e.g., medical management & EMR)
  • Incorporate changes with the highest potential return
    • Look for small changes that deliver the largest return
  • Seek to learn from each opportunity

 

Evangelize the Vision

The architect is a business leader and will often be your biggest advocate for driving business change using technology.  This is a very collaborative role, and this person must work closely with executives and have a firm understanding of business trends, strategic initiatives, and goals so that changes or shifts are adequately reflected in the application and technology architectures supporting the business.  Ways to evangelize the vision include:

  • Continually show the company
    • Where the IT end of the business is headed
    • How it’s going to get there
    • Why it should go there
  • Create opportunities in
    • Design Meetings
    • Architecture, Development & Governance meetings
    • Hallway conversations within IT and with senior leadership

 

 

Continuously Improve Everything

 

Lastly, I would say continuous improvement is an overarching requirement and a mindset you must instill in all of your initiatives.  This can't be driven by the architect or developers alone; it must involve changes in the processes of the business departments.  In our case it involved the close interaction of infrastructure, executive management, and line workers (nurses, case managers, health coaches, etc.).  Like the agile methodologies we put in place to develop our enterprise product, you must have continuous interaction and the mindset of continually improving your products and services.  To do this you and your entire team must:

  • Seek to maintain a better understanding of the business even as it evolves and changes
  • Add and refine terms in the domain dictionary
  • Evangelize, Evangelize, Evangelize…. (E Cubed)
  • Seek alignment between the business and IT and use changes in business as opportunities

 

This is by no means an easy process, but driving change within an organization, especially a large one, is never a simple undertaking.  I'm sure I will have more to add as time goes on, but take these little tokens of knowledge, go forth, and build your own agile enterprise applications.


Recovering a hard drive using SpinRite on a Mac with VMware Fusion

I recently found myself faced with a 2.5″ NTFS-formatted hard drive from a five-year-old laptop that wouldn't cooperate and would constantly fail, so I decided to try to recover what I could by running SpinRite, a great application from Steve Gibson of Gibson Research that I have used successfully many times in the past to recover damaged or unreadable magnetic media.  The last time I actually used it was about 12 years ago, over the course of several days, to recover a failing HD.  Computing and hard drive technology have changed a lot since then, but magnetic drives are still very much part of our day-to-day IT lives.  Normally, when a drive needs extensive evaluation, I would just run SpinRite on the x86 PC the drive came from.  However, after creating a bootable CD and USB key with SpinRite for use on the five-year-old laptop, neither one would work, so I decided to take a different route.  Without another PC handy I assessed my options…  My daily laptop is a MacBook Pro, which (thankfully) doesn't have an internal 2.5″ SATA bay and is sealed tighter than the sub in The Hunt for Red October…  So what's a Mac user to do with an NTFS-formatted magnetic HD that can't be read, an old laptop that won't boot SpinRite, and no other PCs within easy reach?  Try to run SpinRite from a virtual machine on the Mac, of course…

Now, for those of you who don't know, SpinRite was written in assembly and performs very low-level reads and writes against a computer's magnetic mass storage drives.  FreeDOS is incorporated into the SpinRite distribution so it can boot on a bare-metal PC, mount any connected drives, and exercise the individual 1s and 0s stored on the drive enough to get a magnetic drive into as good a working condition as the physical hardware will allow.  With any luck it will operate just well enough to get your information to a readable state and backed up before you have complete hardware failure…  Running SpinRite from a VM was a bit more involved to configure via VMware Fusion on a Mac, and I wasn't completely sure it would work… so I thought I would share my experience.

 

I created a SpinRite.iso file (generated from the SpinRite.exe utility) on another Windows VM I use.  I then created a new MS-DOS-based VM in Fusion and mounted the SpinRite.iso.  It booted to a familiar screen without any issues.

(Screenshot: SpinRite boot screen)

Now the challenge was to get the physical hard drive mounted to the VM…  Looking through the VM settings there was no way to get raw access to a physical HD.  I used a SATA-to-USB adapter to connect the drive to my MacBook Pro, ensuring it was mounted to the Mac, not the VM.

I needed to create a raw-disk .vmdk to make the physical disk accessible to the VM, so I did the following:

From a Mac terminal (I prefer iTerm) type:

diskutil list

In my case the 160GB HD came up as /dev/disk2 but your particular configuration may be different.
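The output will look roughly like the following (illustrative only; the device name, partition types, and sizes below are made up and will differ on your system):

/dev/disk2
   #:                       TYPE NAME        SIZE       IDENTIFIER
   0:     FDisk_partition_scheme             *160.0 GB  disk2
   1:               Windows_NTFS SYSTEM       209.7 MB  disk2s1
   2:               Windows_NTFS Data         159.8 GB  disk2s2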


 

Next from the terminal run the following command to list the partitions that rawdiskCreator can see:

/Applications/VMware\ Fusion.app/Contents/Library/vmware-rawdiskCreator print /dev/disk#

Note: Ensure that the last entry /dev/disk# is changed to the drive you are targeting for raw access.  In my case it was /dev/disk2

(Screenshot: vmware-rawdiskCreator partition listing)

What you should see next is your drive's partitions…  My particular drive was split into two partitions (#1 was very small and #2 made up the bulk of my 160GB HD).

With your partitions known and visible to the rawdiskCreator tool, you can create the .vmdk file that refers to the physical hard disk you are trying to mount and make it available to the SpinRite VM you created earlier.  You will need the disk and partition numbers from the previous command, which in my case were /dev/disk2 1,2 (disk2, partitions 1 & 2).  You will also need the path to the SpinRite .vmwarevm virtual machine you created earlier (in my case ~/Documents/Virtual Machines/SpinRite.vmwarevm/rawDiskFile).  I used rawDiskFile, but this is just the name of your .vmdk file and can be whatever you like.  Make sure to include the ide designator at the end so the VM knows how to mount the drive.

/Applications/VMware\ Fusion.app/Contents/Library/vmware-rawdiskCreator create /dev/disk2 1,2 ~/Documents/Virtual\ Machines/SpinRite.vmwarevm/rawDiskFile ide

After you execute this command successfully, right-click (or Control-click) the SpinRite.vmwarevm file and choose Show Package Contents.  Here you should see the files that make up the .vmwarevm bundle, including the new .vmdk files (if that's what you named them) for the disk and partitions you listed above (1,2).  In my case they were rawDiskFile.vmdk & rawDiskFile-pt.vmdk.

If you boot the VM now you won't see the additional drive, so you have to manually edit the VM's configuration file for it to recognize the drive.  With the .vmwarevm package contents still displayed in Finder, edit the .vmx virtual machine configuration file.  In my case it was called SpinRite.vmx because SpinRite is what I named my VM…  You should probably back up this file in case there is a problem and you need to start over.  Use your favorite editor (BBEdit, TextWrangler, TextEdit, etc.) to edit the .vmx configuration file.  Insert the following lines into the configuration file, being careful not to duplicate an existing ide#:# entry:

ide0:1.present = "TRUE"
ide0:1.fileName = "rawDiskFile.vmdk"
ide0:1.deviceType = "rawDisk"
suspend.disabled = "TRUE"

If the VM's .vmx configuration already has an ide0:1 entry, use another port such as ide1:1.  It is also possible to use scsi#:# or sata#:# if the VM is configured with a SCSI or SATA controller.  The suspend.disabled = "TRUE" entry prevents the VM from suspending and getting out of sync with the attached HD, which is important since most of SpinRite's scans take a long time to run.

The last step is to power on the VM and select your HD…  You may be prompted to enter your administrator password as the VM powers up so it can get raw access to the HD.

(Screenshot: powering on the VM)

If you run into trouble, it may be necessary to unmount the HD from your Mac by ejecting it or unmounting it from Disk Utility prior to powering the VM on.
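From the terminal, the equivalent is the following, which unmounts every volume on the disk without ejecting the device (substitute the identifier you found with diskutil list):

diskutil unmountDisk /dev/disk2   # use your own disk identifier here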

I won’t go into detail on how to use SpinRite, as the tool is pretty self-explanatory, but the 160GB HD partition did appear in the interface, ready to begin SpinRite’s operations.

(Screenshots: SpinRite detecting the 160GB drive and its drive selection screens)

 

Again, this certainly isn't an ideal setup, since SMART access to the HD wasn't available from within SpinRite's menu options (I'm guessing because of the SATA-to-USB setup), but it might work in a pinch.  Hopefully it proves useful for your IT toolkit and helps you extend the life of your SpinRite license, which is worth every penny…

Update:  I tried a couple of drives, and while it worked for one drive, on another there was an error that completely stopped SpinRite and the VM in its tracks…  It only occurred on a specific section of the hard drive where there was clearly an issue…

(Screenshots: SpinRite halted by a drive error)


Performance issues with VMware Fusion 6.0.1

I’ve had relatively great success running Apple’s OS X Mavericks since I installed a developer release on a second-generation MacBook Air in June while at WWDC 2013.  No issues.  None.  Nevertheless, I hesitated to update my primary Mac with developer pre-release software.

When VMware announced during the Mavericks beta that VMware Fusion for OS X had been upgraded to version 6, with a specific focus on compatibility with Mavericks and Windows 8, I promptly upgraded.  Running Windows on OS X 10.8 was every bit as fast as if it were running on native hardware.  No complaints…

When Apple announced the general release of OS X Mavericks I upgraded my primary machine, a two-monitor setup I was looking forward to using with the additional multi-display options.  I upgraded, again, without issue.  But when I started to run a Windows 8 VM inside VMware Fusion I noticed a performance hit…  Ugh…  I knew it was too good to be true.  I can accommodate a lot in order to satisfy my inquisitive mind, but when it comes to my daily workflow I have a much smaller tolerance.  I suspected the performance issue had something to do with Mavericks, so I dug a little deeper and looked at the newly refreshed Activity Monitor in OS X.  Nothing exceptional in the CPU and Memory tabs: CPU was at a reasonable level before and after running a virtual machine in VMware Fusion, and I have plenty of room with 16GB of memory.  Nothing seemed to be pegging either metric.  I looked at Disk and Network… again nothing out of the ordinary.  Knowing that Energy was a new tab I hadn’t seen before in Activity Monitor, I selected it and noticed the “App Nap” column showed “Yes” in the row for VMware Fusion.  App Nap is a new feature in Mavericks that lets OS X put to sleep applications that would otherwise consume excessive amounts of your limited battery power.  A great feature for laptop users, but I was at my desk, plugged into AC, where the savings matter far less.

(Screenshot: Activity Monitor Energy tab showing App Nap active for VMware Fusion)

 

 

A quick scan of Apple’s support site led me to instructions for disabling this feature on an app-by-app basis…  Here’s what I did:

1.  Open a Finder window and navigate to your Applications Folder

2.  Locate VMware Fusion, right click and select “Get Info”

3.  In the “General:” section of the dialog box you will see a checkbox labeled “Prevent App Nap”.  Make sure this box is checked.

(Screenshot: VMware Fusion Get Info dialog)
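If you prefer the command line, the same setting can, as far as I know, be toggled per application with defaults by writing the NSAppSleepDisabled key against the app's bundle identifier (com.vmware.fusion here); treat the key name as an assumption and verify the checkbox in Get Info afterward:

defaults write com.vmware.fusion NSAppSleepDisabled -bool YES   # assumes the NSAppSleepDisabled key; confirm via Get Info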

 

 

I quit and restarted VMware Fusion, launched my Windows 8 virtual machine, and so far so good.  Performance picked up and I'm back to my original daily workflow.  Hope this helps other VMware Fusion users who run into similar issues.


Certified ScrumMaster

I wrapped up training two weeks ago to become a Certified ScrumMaster.  The course was a solid foundation in the fundamentals of Scrum and the process of product management and development in an agile environment.  While I've participated in Scrum sessions many times in the past, this formal indoctrination into the process was very insightful and will prove very useful in future work with teams.  Most sessions you see in practice are variations of the real process, which, to the Scrum Alliance's credit, has evolved and been refined over the years and is actually very good.  With some effort, practice, and discipline from your teams, the breakdown of product backlogs into deliverable sprints can prove fruitful for any product-focused company.


Book Review: REWORK

I just finished reading one of the most refreshing books I've come across in quite a while.  The team from 37signals offers up a great perspective on creating a fast-moving, agile organization that focuses not on one-upping your competition but on delivering services or products that actually make a difference to your employees and customers.  To borrow from the book, what you don't do is just as important, if not more so, than what you do decide to offer in a design, product, or service.  Fried and Hansson show how easy it is to get lost in the mainstream of today's larger organizations and how many organizations become handcuffed by the typical notions of how business should be run.

 

 

Rework offers a refreshing series of essays covering a variety of topics many startups and nascent entrepreneurs would be wise to follow: from the cultivation of a "culture" to the core of what drives your business, the people, and hiring those who are truly driven and will help your organization grow by sparking a passion and desire to build not just profitable but exciting organizations your employees and customers can get behind.

You can find Rework on Amazon.


Social Network Data Visualization

LinkedIn Labs has a new data visualization feature that shows the links between your different contacts and relationships within LinkedIn.  Simply log in at inmaps.linkedinlabs.com and you can see the relationships unfold.

Above is a copy of my map of contacts and their corresponding relationships primarily in the healthcare and IT fields grouped by relationship types, affiliations, and general experience domains.  You can find other inventive concepts at www.linkedinlabs.com.


Designing an Electronic Medical Record

Some big ideas come from asking pretty simple questions…

As it turns out, the inspiration and vision for the development of an innovative Care Delivery Platform at WellMed was born out of necessity: a necessity to accurately and efficiently deliver timely information about patients' health status to caregivers at the point of care, and to document in a way that does not inhibit the caregiver's ability to deliver quality care to seniors.

Many physicians are burdened with a mass of paperwork detailing patient activity.  The simple idea of putting relevant and timely information in the hands of a physician when it is most important to help manage risk and provide adequate decision support at the point of care is a goal many EMR vendors have tried to achieve by simply aggregating and archiving data.  There are several decision support tools available for physicians that manifest themselves in different ways such as ePrescribing applications, Reference Material, Patient Education Handouts, Risk Adjusted Payment Attestations, Document Management Repositories, Clinical Protocol sheets, etc… some within a single application, many others completely disconnected and disjointed requiring the user to log into multiple applications to perform seemingly simple functions.

When you think about it, clinicians and many tertiary healthcare providers live and work in an environment where the vast majority of documents sent and received are in paper form, and the volume is still growing.  When you add electronic information in the form of e-mail, document management systems, and patient data to the mix, it becomes inefficient and cumbersome for providers to manage effectively.

We’re two decades into the internet revolution, and despite many efforts to create an all electronic clinic, paper is still the predominant method of healthcare communication in this country.  It’s 2010 and many providers get their documents more or less the same way we did 200 years ago!!!

That’s absurd…

While we can't change an entire industry, we can start by looking at ourselves and how we deliver healthcare to seniors.  When I started with WellMed two years ago I was inspired by our CEO's vision and his approach of leveraging technology to gain efficiencies and better manage risk for our patients, to keep them healthy and out of the hospital.  At the same time I saw his frustration with current solutions that did not allow us to progress to the next level of patient care.  The truth is we will never completely get rid of paper, a common misconception, but we can manage it more effectively and make it more accessible to clinicians.  What our clinics needed was a new, complementary approach to managing patient information.

So here are the simple questions that led to the creation of the Care Coordination Platform:

What would it take to deliver quality information to physicians at the point of care?

What would empower physicians to help deliver quality care for seniors and help improve outcomes?

After much thought and effort we think we've got an approach that nails these things.  It brings efficiency and new benefits for both the business and physicians, and more importantly for our seniors.  So far the feedback and the demand for a solution have been terrific.

As we get ready to bring the solution we started developing a year ago to physicians, we will continue to evolve the platform to include many other data elements in the patient's continuum of care.  We have not been sitting idle; we have solicited feedback from many providers and will continue to be engaged with all users of this new system.  Initial feedback has been great and we've got lots more work ahead, but we are off to a great start.

When we introduce the EMR Preview in 1Q 2011, it will have exceeded our current EMR functionality in many ways, and we plan to follow quickly with quarterly releases of additional functionality, including ePrescribing and document management integration in subsequent iterations.  I'm very proud of what our team has created in such a short amount of time, and we will continue to develop it as we forge new ground and build new integrations with other custom and vendor solutions.  I would love to show you more than just a teaser image, as the user interface takes full advantage of rich internet application (RIA) functionality.  I truly believe what we have designed will empower our providers to deliver quality care to seniors.  Until we make our internal development efforts widely available to contracted providers, I will instead focus future posts on our approach to architecture, user interface, SOA, and agile development.


Configure Evolution on Ubuntu 9.10 for use with Exchange

I downloaded and installed the final release of Ubuntu 9.10 Karmic Koala on my Dell M4400.  So far I'm very impressed with the fit and finish of this latest Linux release.  Evolution is one of the many e-mail clients available for Linux, but one of the very few with the ability to access your corporate Microsoft Exchange server out of the box.  It is very similar to Outlook on your Windows desktop, so you will feel right at home.  Below is a walkthrough of how to connect Evolution to your Exchange 2003 server.

When you launch Evolution you will be presented with the following setup screens:

(Screenshot: Restore from Backup step)

The first dialog box presented is “Restore from backup” which, as the title suggests, restores Evolution from a previous backup.  I skipped this step since this is a new installation.

 

(Screenshot: Identity step)

The next dialog box is “Identity”, where you enter your full name and the e-mail address associated with the Exchange account you are trying to set up.  If this is your primary account you can leave the “Make this my default account” checkbox checked, as I did above.

(Screenshot: Receiving Email step)

In the “Receiving Email” dialog box that appears next you will enter your Outlook username and Outlook Web Access URL.  If you are unsure what it is, it will typically take the form https://domain name/exchange/.  Be sure to include /exchange after the URL if applicable.  Then press Authenticate to enter your Exchange password.  If successful you will see your Exchange mailbox username appear in the Mailbox: field.
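For example, if your company's Outlook Web Access lives at mail.example.com (a hypothetical host name; substitute your own mail server), the URL you would enter is:

https://mail.example.com/exchange/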

 

(Screenshot: Receiving Options step)

In the “Receiving Options” dialog box you will see several options to set the frequency and limits of your Exchange mailbox.  You can simply leave the defaults or modify them to your liking, as I did above, to improve security and the frequency of e-mail delivery.

(Screenshot: Account Management step)

In the “Account Management” dialog box you name your newly configured account.  Work, Personal, or your company name works well here.

(Screenshot: final confirmation step)

 

Then that’s it…  Just press the Apply button and you are brought straight into the Evolution application with its new configuration.  It will immediately start downloading your e-mail and associated folders, including Calendar, Tasks, Memos (Notes in Outlook), and Contacts.


Installing Windows 7 from a USB Drive

Installing Windows 7 is much faster from USB than from optical media, so I decided to cut as much time off the install as possible by putting the distribution media on a USB key rather than using the old-school, circa-1998 CD-ROM method.  If you want to install Windows 7 from a portable USB key, read on…

NOTICE:  Pay close attention to each step.  Every installation is different, so be sure to read through all the steps before attempting to format your USB key.  It is easy to type the wrong drive ID when formatting.  I will not be responsible if you format the wrong drive, so proceed at your own risk.  If you don't feel comfortable with these steps, stop and do the install via DVD instead.

Blah…Blahh…Blahhhh….OK, now that the disclaimer is out of the way let’s dig in…


Step 1:  Grab a 4GB or larger USB key.  The entire distribution is a little over 2GB, so 4GB drives work nicely.

Step 2:  Plug the key into a workstation with a DVD drive that can read the Windows 7 Install media.  I used an existing Windows 7 desktop but you can also use a Vista/XP workstation as well.

Step 3:  Open a command prompt.  If using Vista be sure to open the command prompt as administrator by right clicking on the command prompt icon and selecting “Run as Administrator”

Step 4: Type the following into the command prompt window

  • diskpart
  • list disk (The disk number for my USB key was 1; be sure to check the size to identify the correct disk ID.  This is important)
  • select disk 1 (you are about to format the disk, so be sure you have the correct disk ID... see the step above)
  • clean
  • create partition primary
  • select partition 1 (mine was 1; be sure to put your ID here instead!)
  • active
  • format fs=NTFS
  • assign
  • exit

Step 5:  Insert the Windows 7 disk into the same workstation you used to format the USB key

Step 6:  In the same command prompt window navigate to the boot directory on the DVD drive where the Windows 7 Install disk is located

Step 7:  To make the drive bootable enter the following command

  • bootsect /nt60 E: (Where E: is the drive letter assigned after you typed the assign command above)

Step 8:  With the USB key formatted and bootable you can now copy all the files from the Windows 7 install DVD onto the USB Key
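If you prefer to do the copy from the same command prompt, something like the following works; the drive letters are assumptions, so adjust them for your system:

rem D: = Windows 7 install DVD, E: = USB key (adjust for your system)
xcopy D:\*.* E:\ /s /e /f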

When you boot your target PC, be sure to boot via the USB option in your boot priority.  Some workstations have a key you can press during initial startup to change the boot device.