Experience Things in a Whole New Way



The Samsung SUR40 with Microsoft PixelSense brings people together to connect, learn, and decide. It enables experiences that change the way people collaborate and connect, with a stunning 360-degree interface. The SUR40 sees and responds to touch and real-world objects, supporting more than 50 simultaneous inputs.

This experience comes to life in the 40-inch Samsung SUR40 display, which can be used as a table, mounted on the wall, or embedded in other fixtures or furniture.

Touch the Possibilities

How can you experience the Samsung SUR40 with Microsoft PixelSense in your organization? The only limit is your imagination. Whether you're in hospitality, retail, healthcare, professional services, or the public sector, you can change the way people interact with information and with each other.
  • Make content more engaging. Give your customers immersive and collaborative ways to engage with photos, videos, documents, maps, custom applications, and more.
  • Plan and simulate. Bring to life real-time "if/then" modeling and visualization, simulations and calculations—perfect for financial services, healthcare, and other consultative environments.
  • Make learning more fun. Breathe new life into the education process with rich visualizations that encourage teamwork and enhance learning.
  • Transform the shopping experience. Make shopping more immersive by connecting customers with more options, recommendations, product and service comparisons, and personalized service.
  • Connect with customers through games and pastimes. Have some fun by putting the SUR40 in restaurants, bars, hotel lobbies, and other venues, associating memorable experiences with your brand.
  • Communicate and connect. Give people an efficient and intriguing new way to get the information they're looking for—like maps and tourist destinations in a hotel lobby. Or use it to help them exchange personal information so they can connect with each other and with your business.

Creating Applications for Microsoft PixelSense



Use the Microsoft Surface 2.0 Software Development Kit (SDK) to build applications for the Samsung SUR40 and Windows touch PCs. The SDK works with Visual Studio and targets the .NET Framework 4.0 running on Windows 7. Get started today with the resources available in the Microsoft PixelSense Design and Development Center.

Applications for Microsoft PixelSense are built using the Microsoft Surface 2.0 SDK and are ideal for any scenario in which users need to interact together on a single device. The focus of these applications is on creating real connections, whether that means connecting customers with information and each other, or connecting the device to other devices. The Microsoft Surface 2.0 SDK supports Windows 7 Professional (32-bit and 64-bit), so you can use a Windows 7 touch computer for basic testing or for creating touch applications. And with more administration tools, deploying and managing the Samsung SUR40 in an organization is easier than ever. The Microsoft Surface 2.0 SDK builds upon:
  • Windows 7 Professional for Embedded Systems (64-bit)
  • .NET Framework 4.0
  • Windows Presentation Foundation (WPF) 4.0
  • Microsoft XNA® Framework 4.0
  • Windows PowerShell and DMTF DASH support, and enhanced administrator tools
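Because Surface 2.0 applications are WPF applications, their interfaces are defined in XAML. As an illustrative sketch only, not markup taken from the SDK templates, a minimal application window might host a ScatterView, the Surface control that lets several people move, rotate, and resize items at the same time; the control and namespace names below follow Surface SDK conventions, but verify the exact markup against the SDK's project templates:

```xml
<!-- Hypothetical minimal sketch of a Surface 2.0 window hosting a ScatterView.
     Class name and window title are placeholders; check the SDK templates
     for the authoritative markup. -->
<s:SurfaceWindow x:Class="SampleApp.SurfaceWindow1"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:s="http://schemas.microsoft.com/surface/2008"
    Title="SampleApp">
  <!-- ScatterView lets multiple users manipulate items simultaneously -->
  <s:ScatterView>
    <s:ScatterViewItem>
      <TextBlock Text="Touch and drag me" />
    </s:ScatterViewItem>
  </s:ScatterView>
</s:SurfaceWindow>
```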

The Power of PixelSense


Microsoft PixelSense allows a display to recognize fingers, hands, and objects placed on the screen, enabling vision-based interaction without the use of cameras. The individual pixels in the display see what's touching the screen and that information is immediately processed and interpreted.

Think of it like the connection between the eye and the brain: you need both, working together, to see. In this case, the eye is the sensor in the panel; it picks up the image and feeds it to the brain, the vision input processor, which recognizes the image and does something with it. Taken as a whole, this is PixelSense technology.

How Does PixelSense Work?

A step-by-step look at how PixelSense works:

  1. A contact (finger, blob, tag, or object) is placed on the display.
  2. The IR backlight unit provides light (through the optical sheets, LCD, and protection glass) that hits the contact.
  3. Light reflected back from the contact is seen by the integrated sensors.
  4. The sensors convert the light signal into an electrical signal/value.
  5. Values reported from all of the sensors are used to create a picture of what is on the display.
  6. The picture is analyzed using image-processing techniques.
  7. The output is sent to the PC. It includes the corrected sensor image and various contact types (fingers, blobs, tags).
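The later steps of this pipeline (sensor values to contact list) can be sketched in a few lines. The following is a toy simulation under simplified assumptions, not the SUR40's actual vision processor: a grid of reflected-light readings is thresholded into a binary image, adjacent bright pixels are grouped into connected regions, and each region is reported as a contact. The size-based finger/blob classification is an illustrative placeholder.

```python
# Toy sketch of PixelSense steps 4-7:
# sensor values -> binary image -> connected regions -> contact list.
# Not the SUR40's actual vision processor; thresholds and the
# finger/blob size rule are illustrative assumptions.

def find_contacts(sensor_values, threshold=0.5):
    """Threshold a 2D grid of reflected-light readings and group
    adjacent bright pixels into contacts (connected components)."""
    rows, cols = len(sensor_values), len(sensor_values[0])
    bright = [[v >= threshold for v in row] for row in sensor_values]
    seen = [[False] * cols for _ in range(rows)]
    contacts = []
    for r in range(rows):
        for c in range(cols):
            if bright[r][c] and not seen[r][c]:
                # Flood-fill one connected region of bright pixels.
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and bright[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                # Classify by region size: small -> finger, larger -> blob.
                kind = "finger" if len(pixels) <= 4 else "blob"
                contacts.append({"type": kind, "size": len(pixels)})
    return contacts

# Two separate bright regions: one small (finger) and one larger (blob).
frame = [
    [0.9, 0.0, 0.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.8, 0.8],
    [0.0, 0.0, 0.0, 0.8, 0.8],
    [0.0, 0.0, 0.0, 0.8, 0.8],
]
print(find_contacts(frame))
```

On the sample frame above, the sketch reports one single-pixel "finger" contact and one six-pixel "blob" contact, mirroring how the real pipeline turns raw sensor values into typed contacts for the PC.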

Community Resources