Constructing Haptic Systems

Haptic technology is a specialized branch of mechatronics that combines mechanical, electrical, and computational components. Thanks to advanced sensors and actuators, it gives users a richer interaction with machines than traditional systems. In addition to the visual and audio output of a computer, haptics supplies tactile stimuli such as touch, pressure, weight, texture, and warmth. This fosters a deeper, more concrete connection between us and our devices, making our use of software more immersive. In this blog, we will examine the advantages of implementing haptics across a variety of applications, as well as the most recent design approaches for haptic feedback. 

Use Cases for Haptics 

Before asking why haptics matters, let's start by examining the ways in which it is already being used and where it is headed. 

Medical 

In the medical industry, for example, haptics enables greater control and safety by letting doctors feel what a robotic hand touches. Using haptic technology in procedures such as laparoscopic surgery, surgeons can make smaller incisions that heal more quickly for the patient. Remote-controlled manipulators and video now allow a surgeon to execute delicate procedures with greater precision. A surgeon needs to be aware of the force the blade is applying: too much produces an incision that is too deep, while too little results in a shallow one. A surgeon must also be able to tell whether they are cutting through a blood vessel or simply moving one out of the way. Force feedback is crucial in this scenario. 

Gaming 

In gaming applications, rather than relying on joystick and keyboard clicks alone, haptics gives the user virtual feedback that resists control forces and conveys the sensation of textures and other physical phenomena. So far, micromotors, piezo actuators, fluidic transfer, and air pressure have been used to physically engage with the user. But designing with these haptic technologies differs greatly from more conventional machine design. 

Fortunately, to help engineers who are new to haptic technology, device manufacturers are addressing these demands with development systems and application examples. Accelerometers are a crucial component in haptic designs: they deliver force feedback data in remote robotic assemblies, track hand motion in gloves, and adjust the field of view in headsets. 

Numerous device manufacturers provide development kits, application notes, reference designs, and accelerometers for OEM applications. Additionally, because accelerometers are widely used in cell phones, these multi-axis devices are inexpensive and easily accessible from well-known distributors and manufacturers. A common accelerometer development kit includes multi-axis sensors and a USB, I2C, SPI, or UART computer interface. Measurements up to 16G are not unusual, and outputs might be digital or analogue. 
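As a rough illustration of what talking to such a kit over I2C looks like, here is a minimal Python sketch that polls a generic 3-axis accelerometer. The bus number, device address, register map, and ±16 g scale factor are assumptions for illustration; a real part's datasheet (and the vendor's own library, if one is provided) should be used instead.

```python
# Minimal sketch: polling a generic 3-axis I2C accelerometer from a host or
# single-board computer. The bus number, device address, and register map
# below are placeholders -- substitute the values from your sensor's datasheet.
import time
from smbus2 import SMBus

I2C_BUS = 1               # assumption: accelerometer on I2C bus 1
DEV_ADDR = 0x68           # assumption: 7-bit device address
REG_DATA = 0x3B           # assumption: first of six data registers (X/Y/Z, 16-bit)
SCALE_G = 16.0 / 32768.0  # assumption: +/-16 g full scale, 16-bit signed output

def read_axes(bus):
    """Read six bytes (X, Y, Z high/low) and convert to g."""
    raw = bus.read_i2c_block_data(DEV_ADDR, REG_DATA, 6)
    axes = []
    for i in range(0, 6, 2):
        value = (raw[i] << 8) | raw[i + 1]
        if value >= 0x8000:          # two's-complement sign correction
            value -= 0x10000
        axes.append(value * SCALE_G)
    return tuple(axes)

if __name__ == "__main__":
    with SMBus(I2C_BUS) as bus:
        while True:
            x, y, z = read_axes(bus)
            print(f"x={x:+.3f} g  y={y:+.3f} g  z={z:+.3f} g")
            time.sleep(0.05)
```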


Consumer Products 

Haptic designs are increasingly incorporating Inertial Measurement Units (IMUs) for applications that demand complicated motion recording and processing. IMUs are essentially sensors that include an accelerometer, gyroscope, and magnetometer. These highly integrated, ultra-low-power sensors can be tailored for a variety of high-performance uses, such as wearable technology, head-mounted technology, smartphones, cameras, drones, and augmented reality (AR) and virtual reality (VR) headsets. IMUs are a reliable smart sensor system package with ready-to-use software algorithms that can quickly calculate orientation, position, and velocity. This allows for position tracking and activity/gesture recognition with high accuracy and low latency. 
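To make the "ready-to-use fusion algorithm" idea concrete, here is a minimal complementary-filter sketch that blends gyroscope integration with accelerometer tilt to estimate pitch and roll. It is a simplified stand-in for the far more capable fusion code IMU vendors ship; the 0.98 blend factor, the axis conventions, and the sample values are assumptions.

```python
# Minimal sketch of sensor fusion for orientation: a complementary filter that
# blends gyroscope integration (smooth, but drifts) with accelerometer tilt
# (noisy, but drift-free). Vendor fusion libraries are far more complete; this
# only illustrates the idea for pitch and roll.
import math

ALPHA = 0.98  # assumption: trust the gyro 98%, the accelerometer 2%

class ComplementaryFilter:
    def __init__(self):
        self.pitch = 0.0  # degrees
        self.roll = 0.0   # degrees

    def update(self, accel, gyro, dt):
        """accel in g (ax, ay, az), gyro in deg/s (gx, gy, gz), dt in seconds."""
        ax, ay, az = accel
        gx, gy, _ = gyro

        # Tilt angles implied by gravity alone
        accel_pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
        accel_roll = math.degrees(math.atan2(ay, az))

        # Blend integrated gyro rate with the accelerometer estimate
        self.pitch = ALPHA * (self.pitch + gx * dt) + (1 - ALPHA) * accel_pitch
        self.roll = ALPHA * (self.roll + gy * dt) + (1 - ALPHA) * accel_roll
        return self.pitch, self.roll

# Example with made-up samples (device tilted, slowly rotating)
f = ComplementaryFilter()
print(f.update(accel=(0.1, 0.0, 0.99), gyro=(1.5, 0.0, 0.0), dt=0.01))
```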

These multi-axis programmable smart sensor systems are also inexpensive and easily accessible from conventional distributors and manufacturers due to economies of scale and the ubiquitous use of IMUs in smart phones, cameras, drones, and other consumer gadgets. IMU development kits typically come with a multi-axis sensor, environmental sensors, and a computer interface like USB, I2C, SPI, or UART, just like accelerometers. 

Techniques for Haptic Design 

The wide range of haptic technology applications has given rise to a number of design strategies, which engineers are still working to perfect. Some haptic designs use microfluidic techniques, pumping fluid into and out of a series of chambers to produce sensation on the skin. Capillary tubes, micro-valves, and micromotor-driven pumps are frequently employed, as in the sketch below. Fortunately for these microfluidic approaches, motor control technology is mature, and a wide variety of motor control development kits is readily available. 
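As a hedged sketch of how such a system might be sequenced, the following Python snippet pulses a single tactile chamber by opening an inlet micro-valve, running the pump briefly, and then venting. The Raspberry Pi pin numbers, PWM frequency, and timing values are assumptions, and suitable driver electronics between the pins and the valves/pump are taken for granted.

```python
# Minimal sketch of sequencing a microfluidic tactile chamber: open an inlet
# micro-valve, run the pump briefly to inflate, then vent through an outlet
# valve. Pin numbers are assumptions for a Raspberry Pi driving the valves and
# pump through suitable driver transistors -- not a production design.
import time
import RPi.GPIO as GPIO

INLET_VALVE = 17   # assumption: BCM pin wired to the inlet valve driver
OUTLET_VALVE = 27  # assumption: BCM pin wired to the outlet (vent) valve driver
PUMP_PWM = 18      # assumption: BCM pin wired to the pump driver

GPIO.setmode(GPIO.BCM)
GPIO.setup([INLET_VALVE, OUTLET_VALVE, PUMP_PWM], GPIO.OUT, initial=GPIO.LOW)
pump = GPIO.PWM(PUMP_PWM, 1000)  # 1 kHz PWM for pump speed control
pump.start(0)

def pulse_chamber(inflate_s=0.3, hold_s=0.2, duty=60):
    """Inflate the chamber, hold the pressure, then vent."""
    GPIO.output(OUTLET_VALVE, GPIO.LOW)
    GPIO.output(INLET_VALVE, GPIO.HIGH)
    pump.ChangeDutyCycle(duty)            # run the pump at ~60% speed
    time.sleep(inflate_s)
    pump.ChangeDutyCycle(0)               # stop pumping, hold the pressure
    GPIO.output(INLET_VALVE, GPIO.LOW)
    time.sleep(hold_s)
    GPIO.output(OUTLET_VALVE, GPIO.HIGH)  # vent to release the sensation

try:
    for _ in range(5):
        pulse_chamber()
        time.sleep(0.5)
finally:
    pump.stop()
    GPIO.cleanup()
```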

Microcontroller and Op-Amp Designs 

Because micromotors draw little current and can be driven in both directions, op-amps can often drive them directly. Where op-amps alone are insufficient, microcontrollers with motor control capabilities, such as higher-current drivers, pulse-width modulation (PWM) outputs, multiple timers, and even analogue outputs, can drive the numerous motors, pumps, or micro-valves in an application, as sketched below. 
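The sketch below illustrates the microcontroller/PWM side of this idea: driving a small micromotor in both directions through an H-bridge from a Raspberry Pi. The pin assignments, 20 kHz PWM frequency, and effect timings are assumptions chosen for illustration rather than values from any specific haptic driver.

```python
# Minimal sketch of driving a small vibration micromotor in both directions
# through an H-bridge (two direction inputs plus a PWM enable pin). Pin
# numbers and the 20 kHz PWM frequency are assumptions, not a specific part.
import time
import RPi.GPIO as GPIO

IN_A, IN_B, ENABLE = 5, 6, 13   # assumption: BCM pins to the H-bridge inputs

GPIO.setmode(GPIO.BCM)
GPIO.setup([IN_A, IN_B, ENABLE], GPIO.OUT, initial=GPIO.LOW)
pwm = GPIO.PWM(ENABLE, 20000)   # 20 kHz keeps the drive above the audible range
pwm.start(0)

def spin(direction, strength, duration):
    """direction: +1 or -1, strength: 0-100 % duty, duration in seconds."""
    GPIO.output(IN_A, GPIO.HIGH if direction > 0 else GPIO.LOW)
    GPIO.output(IN_B, GPIO.LOW if direction > 0 else GPIO.HIGH)
    pwm.ChangeDutyCycle(strength)
    time.sleep(duration)
    pwm.ChangeDutyCycle(0)

try:
    spin(+1, 70, 0.15)   # short, firm "click"
    time.sleep(0.1)
    spin(-1, 40, 0.30)   # softer reverse buzz
finally:
    pwm.stop()
    GPIO.cleanup()
```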

Digital Signal Processing 

Processors with digital signal processing (DSP) capabilities are especially useful for operating micromotors and for measuring back EMF, which can be used to gauge resistance against digitally asserted pressures. Development boards typically pair a processor section with a power transistor array. DSP-based haptic designs have great potential for creating immersive experiences across a variety of media, including games, movies, and music. By adding tactile vibrations to audiovisual content, haptic designs can improve user engagement and sensory stimulation. Processors with DSP capabilities can run complex filtering algorithms so that the application's many motors are controlled precisely. These motor control approaches can also be used to build sensory systems based on fluid pumps and air pressure. The same technique can be adapted to drive piezo actuators and ultrasonic emitters, as well as micro piezo actuators that produce electromechanical sensation. 
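As a minimal illustration of the DSP idea, the sketch below low-pass filters a (synthetic) back-EMF signal and maps the estimate to a PWM duty cycle. The sample rate, cutoff frequency, and the simple inverse mapping from back-EMF to drive strength are assumptions; a production design would use the DSP's hardware filters and a properly characterized motor model.

```python
# Minimal sketch of the DSP idea described above: low-pass filter a sampled
# back-EMF signal to estimate how hard the user is pressing against the
# actuator, then map that estimate to a PWM duty cycle. The signal here is
# synthetic; on real hardware the samples would come from an ADC.
import math
import random

SAMPLE_RATE = 1000.0   # Hz, assumption
CUTOFF_HZ = 20.0       # low-pass cutoff, assumption

# First-order IIR low-pass coefficient derived from the cutoff frequency
alpha = 1.0 - math.exp(-2.0 * math.pi * CUTOFF_HZ / SAMPLE_RATE)

def duty_from_back_emf(samples):
    """Filter noisy back-EMF samples and map the result to a 0-100 % duty."""
    filtered = 0.0
    for v in samples:
        filtered += alpha * (v - filtered)   # exponential smoothing
    # Lower back-EMF means the motor is being loaded (pressed against),
    # so drive harder; higher back-EMF means little resistance, so back off.
    duty = max(0.0, min(100.0, 100.0 * (1.0 - filtered)))
    return filtered, duty

# Synthetic back-EMF: nominal 0.6 (normalized) with measurement noise
samples = [0.6 + random.gauss(0, 0.05) for _ in range(200)]
emf, duty = duty_from_back_emf(samples)
print(f"filtered back-EMF ~ {emf:.2f}, commanded duty ~ {duty:.0f} %")
```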


Haptics Using Ultrasound 

A more sophisticated haptic design uses ultrasonic waves from an ultrasonic array that combine to create an impression of force. This kind of ultrasonic haptic technology focuses ultrasound waves to generate mid-air tactile sensations, so users can feel feedback against their hands without touching a device. It has mostly been used to provide tactile feedback that simulates pressing a virtual button, but its use is expanding to stimulate larger areas of the body. 
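A short sketch can make the focusing principle concrete: each element of the array is driven with a phase offset that compensates for its path length to the focal point, so the wavefronts arrive in phase. The 40 kHz frequency, 8-element linear geometry, and element pitch below are assumptions for illustration only.

```python
# Minimal sketch of how an ultrasonic phased array focuses energy at a point in
# mid-air: each element gets a phase offset that compensates for its extra path
# length to the focal point, so the waves arrive in phase and add up there.
import math

FREQ_HZ = 40_000          # typical airborne ultrasound transducer frequency
SPEED_OF_SOUND = 343.0    # m/s in air at room temperature
PITCH_M = 0.0105          # assumption: 10.5 mm element spacing
N = 8                     # assumption: 8-element linear array

def focus_phases(focal_point):
    """Return the drive phase (radians) for each element of a linear array."""
    fx, fy, fz = focal_point
    wavelength = SPEED_OF_SOUND / FREQ_HZ
    distances = []
    for i in range(N):
        x = (i - (N - 1) / 2) * PITCH_M        # element position on the x axis
        distances.append(math.dist((x, 0.0, 0.0), (fx, fy, fz)))
    longest = max(distances)
    # Elements closer to the focus are delayed so all wavefronts coincide.
    return [2 * math.pi * ((longest - d) / wavelength) % (2 * math.pi)
            for d in distances]

# Focus 15 cm above the centre of the array
for i, phase in enumerate(focus_phases((0.0, 0.0, 0.15))):
    print(f"element {i}: phase offset {math.degrees(phase):6.1f} deg")
```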

Hardware alone won't be sufficient for upcoming HD haptics. Future haptic system designs must rely on software to overcome the limitations of hardware-only approaches. 

Conclusion  

Although haptic design is a relatively new field, engineers can find development tools and advice online, and more development kits and application notes will appear as haptic products do. The gaming business will advance haptic technology more quickly and further than the medical, industrial, robotic control, and remote repair sectors. These readily available, higher-volume applications will drive haptic technology forward, making specialized applications easier to build and opening opportunities for future discoveries and uses. 


26 Mar, 2024
It is nothing new for us to adore audio, whether in games, theatre, or music. That love has propelled us from the early days of stereo to sophisticated surround sound, inspiring elaborate home theatre systems and high-end audio equipment. Audio, however, has always been highly individual. What sounds pleasing to one person may be unsettling to another. Some of us get a kick out of a deep bass thrum, while others are drawn in by crisp trebles. To add to the complexity, even for the same listener, a tune that sounds great with one set of equalizer settings may need tweaks the next time. Surround sound complicated matters further: in our quest for audio perfection, the number of speakers, woofers, and tweeters seemed to grow without end as we progressed from 5.1 systems to 7.1 and then 9.1. And just as someone believed they had perfected their setup, technologies like DTS and Dolby Atmos emerged, adding new dimensions to the mix.

Amid all these developments, spatial audio looks revolutionary. In place of a one-size-fits-all strategy, it introduces customized audio experiences. A unique audio profile is generated from comprehensive 3D scans of the listener's skull. It's not only about ear shape or spacing; it's also about listening awareness. The unique Masimo sensitivity of each listener is detected using in-ear microphones, and this sensitivity, which represents the way our ears respond to frequencies, is combined with the anatomical information. The outcome? A customized audio stream designed to give the listener an unmatched, immersive experience.

How Immersive Is Spatial Audio?

Head tracking is essential to a genuinely immersive spatial audio experience. As you tilt your head, you hear different things coming from different directions in real time. Spatial audio attempts to replicate that immersive quality of real life, but there is a catch: how does the audio processing engine in a home theatre know which way your head is pointing? Unless you add more technology on top of it, it doesn't. Video cameras could conceivably watch you while you watch a movie and adjust what you hear based on how your head is oriented. Another option is to mount a cell phone on your head and track it using the phone's gyros and accelerometers; some cell phone manufacturers integrate spatial audio processing into their devices, with operating system support. This can work, but not as effectively as a system that anchors your head position with precise data.

Immersive gaming already uses this kind of head-orientation data to provide a more immersive experience. Since the screen updates to reflect your gaze direction, a VR headset lets the VR program determine your head orientation, and you hear the scene from that perspective as well as see it. For this reason, video games have the power to advance the technology. Firstly, gaming is currently the most widely used application. In addition, because players are drawn to the more immersive experience, game software developers will embrace this technology soon. The processing power and memory/storage capacity of gaming consoles also allow them to store the spherical audio track that spatial audio requires.

Prospective Opportunities

It is feasible that accelerometers will soon be incorporated into earbuds and microphones, along with faster bidirectional wireless communications, enabling additional markets to benefit from spatial audio. These developments would let people watching a symphony in a home theatre, for example, turn their head and hear a more prominent brass, woodwind, or string section, depending on where they are looking. Combined with highly sophisticated, filtered directional audio microphones, the technique might also be used by military infantry to locate attackers in a forest, desert, or other concealed area; when a soldier turns their head to select a target, breathing and heartbeats can be filtered and utilized.

Conclusion

As we approach a time when audio can be personalized as uniquely as a fingerprint, we also need to recognize the difficulties and complexities these developments bring. With its promise of hyper-personalization, spatial audio depends heavily on accurate head tracking, a capability that may require additional technologies to be integrated. Since gaming is currently the most popular application, it continues to set the standard for other industries, including home theatre and possibly even the military. We might soon be donning VR headgear or earphones with accelerometers, but the further future holds the possibility of an auditory experience that is not only audible but also tactile. As audio technology advances, we will listen, immerse ourselves, adapt, and change. Our search for the best possible listening experience is as limitless as music itself, always leading us to explore new avenues.
07 Dec, 2023
The design of a user interface (UI) makes a system easier to use. A user interface designer, for instance, makes sure that buttons, when pressed, logically display new information or initiate functions. Applications for cars and other safety-critical contexts, however, add another level of complexity to UI design: a user interface that diverts drivers from the road, even momentarily, reduces the overall safety of the vehicle. Because of this, automotive user experience (UX) is replacing automotive UI. Automotive UX differs from UI in that it describes the driver's interaction with the vehicle rather than the other way around. In contrast to a UI, which simply lists functions and shows information on a screen, a UX actively communicates with the driver through touch, visual cues, and auditory cues. Properly integrated, automotive UX technologies can alert drivers to critical information without distracting them. In this blog, we'll look at how automotive UX is changing to improve driver safety and provide a more natural, engaging driving environment.

HUDs Maintain Driver Focus

The introduction of heads-up displays (HUDs) has been one of the biggest changes in the evolution of the vehicle user experience. In some cars, "smart" digital meters that interact with the driver when important information needs to be communicated can entirely replace analogue gauges, thanks to HUDs. By delivering crucial information without requiring drivers to glance down at the dashboard or navigate an infotainment menu in the center console, HUDs contribute significantly to vehicle safety. When the speed limit is crossed, for instance, the displayed speed may flash or brighten, alerting the driver instead of making them do the math. Meanwhile, the extra visual real estate can carry alerts and messages about potential road hazards, traffic signs, and more. Manufacturers are now tightening the integration between smartphones and HUDs to streamline non-driving tasks such as music playback, call taking, and navigation. Confirming commands through visual or auditory means preserves the integrity of the driving experience, especially when there are sirens nearby or children arguing in the back.

Audio Improvements Enable Hands-Free Operation

Like the visual and auditory confirmations discussed above, hands-free control is a potent tool for improving safety and streamlining the user experience. Drivers can keep their hands on the wheel when they can simply ask for what they want. Ease of use is a crucial component of a successful hands-free system, and audio control offers a far more user-friendly interface for functions that are not essential to driving, such as music, calls, navigation, and climate control. But things weren't always this way. The first hands-free systems fitted in automobiles had convoluted menus that were hard to navigate, particularly for features that weren't used very often. These older systems also struggled to manage multiple drivers, leading to annoyances such as connecting the primary driver's phone after someone else had used the car. Since then, many infotainment features, such as hands-free audio, have developed into separate functionalities. From the user's point of view, however, this frequently led to an application-layer labyrinth of different menus, systems, and options; architecturally, it meant using several boxes from various manufacturers for the various infotainment systems. Today, functional consolidation of platforms from various suppliers into a single box is becoming more common. Minimizing the separate auditory and visual interfaces required by each additional box results in fewer, simpler user interfaces, along with savings in power, space, cost, and design complexity. A fully integrated system that momentarily mutes loud music to make room for other audio cues, such as safety warnings, provides a consistent UX that can improve the overall in-car experience.

Information at Your Fingertips

Touch controls ergonomically extend the classic control console of buttons, sliders, and menus. But modern touch technology does more than enable bigger screens with multitouch capability. Haptic feedback, a touch-based response that vibrates a control to confirm a command has been accepted, lets drivers issue commands without taking their eyes off the road. It can also be used to produce safety alarms: in emergency situations, such as when the vehicle is about to drift off the road, the steering wheel may vibrate. With integrated gesture control in infotainment systems, touch may eventually become obsolete; drivers can already operate a variety of entertainment, navigation, and other vehicle features with touchless hand gestures that don't take their attention away from operating the vehicle, rather than gazing down at a screen to locate buttons and other controls.

Conclusion

In the end, a good user experience increases safety and convenience by keeping the driver's attention on the road. Because a driver can hear and see alerts on a HUD instead of scanning an analogue dashboard for flashing lights, reaction is faster, and more sophisticated interactions become feasible than with gauges and controls alone. Combined with the appropriate supporting technologies, a well-thought-out UX will significantly shape how consumers perceive automobiles. An intuitive UX produces an emotive experience that fosters a positive bond between drivers and their cars. In the coming decades, automotive UX will be a major factor for prospective new car buyers, provided it combines ease of use with the appropriate technology and components.