Current-generation VR controllers are fairly limited in enabling users to “feel” virtual objects; bow strings vibrate with haptic shivers, while guns kick back as they fire bullets. Now researchers at Russia’s Skolkovo Institute of Science and Technology are proposing a big step forward called TouchVR — a wearable accessory that applies direct force on the palm and vibrotactile feedback to the fingers, enabling users to feel the weight, texture, softness, and slippage of VR objects. Each TouchVR wearable looks like the foundation of an Iron Man glove: a circular DeltaTouch 3D force generator centered on the palm, plus vibration motors wired with Velcro pads to the thumb and fingers around it. So equipped, the wearer can feel applied force and sliding motions in the palm, combined with vibrations that run from the palm to the fingertips to simulate object textures. There’s no need to hold another controller, as hands are tracked with a Leap Motion hand sensor and an HTC Vive Pro VR system. The researchers are using several Unity-based VR apps to demonstrate TouchVR’s capabilities, including a virtual spider moving on the wearer’s palm, a bouncing soccer ball, and a pulsing dragon’s egg.
Some time ago, there were complaints that the Lenovo Z5s was facing touch response issues. Now, the latest announcement from the company shows that it has a solution. According to the official ZUI Weibo account, the Lenovo Z5s ZUI 11.1.171 stable version is now available. The upgrade package is 153.15 MB in size and comes with a fix for the occasional touch failure. This update is currently rolling out in China. The Lenovo Z5s was released in December 2018 and it comes with a 6.3-inch LCD waterdrop display.
Fans of the blockbuster movie "Iron Man 3" might remember the scene in which the characters step inside the digital projection of a "big brain" and watch as groups of neurons are "lit up" along the brain's neural "map" in response to physical touch. Now, much like that scene, researchers at the University of Missouri have discovered a new insight into how the complex neural map of the human brain operates. "When a person touches something with their right hand, a specific 'hand area' in the left side of the brain lights up," said Scott Frey, the Miller Family Chair in Cognitive Neuroscience in the Department of Psychological Sciences. "This is a striking example of functional reorganization, or the plasticity, of the human brain." The researchers created a computer-controlled, air-based system to deliver light touch to the hands and face. Functional MRI scans are similar to traditional MRI scans but are sensitive to tiny changes in blood oxygenation levels in the brain that occur when areas of the brain are processing information.
Researchers from Stanford University have created a new tactile display that aims to give blind and visually impaired people access to 3D printing and computer-aided design (CAD). The touch-based display can mimic the geometry of 3D objects modeled on a computer. The researchers say that while design tools empower users to create and contribute to society, the tools also limit who can participate. The work is part of a larger effort in the lab of Sean Follmer, an assistant professor of mechanical engineering, to develop tactile displays. The display the researchers have created looks a bit like the pin-art toys many may have seen at retail stores: a bed of pins that replicates whatever is pushed against it from behind, with the pins forming a 3D shape on the surface.
Tactile Mobility, a tactile virtual sensing and data company headquartered in Haifa, Israel, today announced that it’s secured $9 million in funding from Porsche, Union Tech Ventures (the technology investment arm of the Union Group), and previous investors. Jorge De Pablo, Tactile Mobility chairman and managing partner of LAIG Investments, noted that the company is collaborating with “several” global OEMs (including Ford and Porsche) and road authorities across the U.S., Europe, and Asia, and he expects the capital infusion to accelerate product R&D ahead of an expansion of its go-to-market operations. Tactile Mobility, which was founded in 2012 by Boaz Mizrachi, Yossi Shiri, and Alex Ackerman, provides tactile sensing and data analytics solutions for smart and autonomous vehicles, municipalities, and fleet managers. An in-vehicle software module running on an ECU or aftermarket device collects data generated by a car’s non-visual sensors and applies AI models to generate insights including (but not limited to) road quality, tire grip, RPM, pedal and gear position, wheel angle and speed, vehicle weight, and other vehicle- and road-specific metrics. The insights are fed into onboard computers to optimize driving decisions before they’re uploaded to Tactile Mobility’s cloud, which produces a real-time, anonymized map of road features like grades, banking, curvature, normalized grip levels, and the locations of bumps and potholes. Bespoke models handle tasks such as data ingestion, fusion, noise cleaning, and processing, in theory ensuring high accuracy without the need for human oversight.
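As a rough sketch of the two-tier architecture described above (every type, field, and method name here is invented for illustration and is not Tactile Mobility's actual API), the in-vehicle module can be pictured as a function from raw signal streams to an insight record that is both consumed locally and uploaded:

import java.util.List;

// Hypothetical shapes for the pipeline: raw signals in, insights out.
record SensorFrame(double rpm, double wheelAngle, double wheelSpeed,
                   int gear, double pedalPosition) {}

record RoadInsight(double tireGrip, double roadQuality,
                   double estimatedVehicleWeight) {}

interface EdgeModel {
    // Runs on the ECU: fuse recent frames into one insight estimate.
    RoadInsight infer(List<SensorFrame> recentFrames);
}

interface CloudUplink {
    // Anonymized upload feeding the crowd-sourced road map.
    void upload(RoadInsight insight);
}

class VehiclePipeline {
    private final EdgeModel model;
    private final CloudUplink uplink;

    VehiclePipeline(EdgeModel model, CloudUplink uplink) {
        this.model = model;
        this.uplink = uplink;
    }

    // One cycle: local inference for onboard decisions, then the upload.
    RoadInsight tick(List<SensorFrame> frames) {
        RoadInsight insight = model.infer(frames);
        uplink.upload(insight);
        return insight;
    }
}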
With the goal of increasing access to making, engineers at Stanford University have collaborated with members of the blind and visually impaired community to develop a touch-based display that mimics the geometry of 3D objects designed on a computer. Even with 3D modeling software that offers more accessible ways of inputting designs, blind and visually impaired designers still have to evaluate their work by either creating a physical version they can touch or listening to a description provided by a sighted person. "Design tools empower users to create and contribute to society but, with every design choice, they also limit who can and cannot participate," said Alexa Siu, a graduate student in mechanical engineering at Stanford, who developed, tested and refined the system featured in this research. "This project is about empowering a blind user to be able to design and create independently without relying on sighted mediators, because that reduces creativity, agency and availability." This work is part of a larger effort within the lab of Sean Follmer, assistant professor of mechanical engineering, to develop tactile displays - displays that relay information through touch - for various purposes, such as human-computer interaction and new ways of sharing or explaining 3D information. Although the display Siu presented is a prototype, the lab hopes to make a version that is less expensive, larger and able to create shapes in greater detail.
I've seen touch-sensitive areas on phones and wearable devices before, but never an entire gadget surface that can register touch. That's exactly what San Jose startup Sentons is trying to enable, with ultrasonics that work like sonar to register touch and pressure anywhere on a device's surface. Sentons already has one phone using its technology: the Asus ROG Phone II gaming phone has "air triggers" at the top that are pressure-sensitive touch zones. Vibrating haptics give feedback when they're pressed. This ultrasonic technology could work on any surface, or any shape, in what the company says could be a wide range of materials: metal, wood, leather. I ask whether this tech could even become some sort of future replacement for Apple's now-departed 3D Touch screen tech.
Sentons has unveiled touch and gesture sensors for consumer devices that enable new types of user interfaces and controls. The San Jose, California-based company makes Software-Defined Surfaces (SDS), and its latest examples are the new SurfaceWave Processor and Gesture Engine. Sentons’ ultimate goal is to bring SDS technology to every glass, plastic, and metal surface by combining ultrasonic touch and strain-gauge sensors. With Sentons integration, any consumer electronic device — from a smartwatch to a car dashboard — can become interactive and definable by applications. Like the Marvel superhero Daredevil, Sentons uses enhanced audio acuity to detect not only touch, but force, intent, and subtle nuances that exceed even the capability of capacitive touch sensors. Asus and Tencent worked with Sentons on the ROG Phone II, a gaming phone featuring Air Triggers — software-defined virtual buttons that let the phone be used much like a video game controller.
As handset makers work to make smartphones more streamlined and sleek while introducing new features that will get people buying more devices, one startup wants to help them and other hardware makers change up the game. It is pioneering something called “software-defined” surfaces — essentially, using ultrasound and AI to turn any kind of material, and any kind of surface, into one that will respond to gestures, touch and other forces. Sentons, the Silicon Valley startup building software-defined surface technology, is today announcing the launch of SurfaceWave, a processor and accompanying gesture engine that can be used in smartphones and other hardware to create virtual wheels and buttons for controlling and navigating apps and features on the devices themselves. The SurfaceWave processor and engine are available to “any mobile manufacturer.” Before this, Sentons had already inked direct deals to test out market interest in its technology. Sentons has been around since 2011 but very much under the radar until this year, when it announced that Jess Lee — who had been at Apple after his previous company, the cutting-edge imaging startup InVisage, was acquired by the iPhone maker — was coming on as CEO. (Given the company’s partnership with Tencent and Asus, those are two companies I would think are candidates as strategic investors.)
Finally, since Android 10 is poised to completely overhaul Android’s gesture support, we’ll be looking at how you can update your applications to support Android’s new gesture-based navigation, including how to ensure your app’s own gestures don’t conflict with Android 10’s system-wide gestures. Android gestures can be divided into several categories. Navigation gestures, for example, allow the user to move around your application, and can be used to supplement other input methods, such as navigation drawers and menus. Each touch event is delivered to your app as a MotionEvent, which also describes the touch event’s state via an action code. In the following code, I’m using getActionMasked() to retrieve the action being performed:
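The excerpt cuts off at the class declaration, so here is a minimal reconstruction of the kind of Activity being described (the layout resource and log tag are placeholders, not taken from the original tutorial):

import android.os.Bundle;
import android.util.Log;
import android.view.MotionEvent;
import androidx.appcompat.app.AppCompatActivity;

public class MainActivity extends AppCompatActivity {

    private static final String TAG = "Gesture";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // getActionMasked() strips the pointer-index bits and returns just
        // the action code (ACTION_DOWN, ACTION_MOVE, ACTION_UP, ...).
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                Log.d(TAG, "Finger down at " + event.getX() + ", " + event.getY());
                return true;
            case MotionEvent.ACTION_MOVE:
                Log.d(TAG, "Finger moving");
                return true;
            case MotionEvent.ACTION_UP:
                Log.d(TAG, "Finger lifted");
                return true;
            default:
                return super.onTouchEvent(event);
        }
    }
}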
The company had previously experimented with a touch bar on its MacBook Pro. Apple, it seems, is still enthused by the touchscreen keyboard concept, even though its last halfway attempt at one, the MacBook Pro's Touch Bar, failed miserably. However, the company has changed its approach to the concept, according to a new patent. Instead of replicating a smartphone-style touchscreen on its laptops, the company is working on a touchscreen keyboard with the feel of a real one, according to its patent application, which was discovered on Thursday by Patently Apple. It is a paradigm similar to the solid-state home button on recent iPhones, which is not mechanical yet retains the feel of a click using a haptic motor.
Redmi 8A review: A whole new design

While the Redmi 7A got minor tweaks, Xiaomi brought the Redmi 8A up to speed with 2019 design trends. I took the phone out and about and, for the most part, I didn’t have any issues viewing the display outdoors. The Redmi 8A supports 18W fast charging, but you’ll have to buy a separate charger for that; only a 10W brick is included in the box. Audio output from the headphone jack is loud and clear, though with my 1More Triple Driver earphones I noticed a slight amount of hiss and a lack of dynamic range. Through the loudspeaker, music sounds muffled, with little distinction between lows and mids.
Is Apple’s Touch Bar just not cutting it for your Final Cut Pro or Premiere Pro editing sessions? The Sensel Morph may be the touch-based solution you’ve been looking for. This multitouch, pressure-sensing accessory looks like a giant trackpad, but place one of several rubber “overlays” on it and it turns into a bespoke keyboard for any number of different creative applications. The Morph arrived two years ago on the wings of a successful crowdfunding campaign, but the video-editing overlay is new. A Morph overlay is basically a rubber pad with some buttons or other types of controls stamped out of it. Controls can be remapped and customized within the Sensel desktop app, making the overlay appropriate for other editing applications like Final Cut Pro or DaVinci Resolve.
Samsung seems to be on a trajectory that will see its iconic rotating bezel disappear from its smartwatches. For the second time in a row, the Galaxy Watch Active 2 sported no such control, replacing it with thin, classy-looking borders instead. Samsung did make one compromise and made those bezels touch-sensitive. Strangely, it shipped the smartwatches with that feature disabled out of the box and it is now pushing out an update that flips the switch for everyone. It’s still puzzling that Samsung would ship the Galaxy Watch Active 2 with one of its key features disabled. Then again, there are also other key features that won’t be enabled until next year.
Swiss researchers have developed a wearable, sensor-packed second skin that could let VR users ‘touch’ objects in virtual worlds. Virtual reality has come on in leaps and bounds over the years, but tactile sensations have been notable either by their absence or by their crudity. Thanks to researchers at the École Polytechnique Fédérale de Lausanne's (EPFL) Reconfigurable Robotics Lab (RRL) and the Laboratory for Soft Bioelectronic Interfaces (LSBI), however, this could be about to change. In a paper entitled "Closed-Loop Haptic Feedback Control Using a Self-Sensing Soft Pneumatic Actuator Skin" in the journal Soft Robotics, the scientists reveal details of an ultra-thin second skin that could be worn by a VR user. At just 500 nanometers thick, the artificial skin is far more sophisticated and less obtrusive than existing haptic feedback systems. A series of soft sensors and actuators is designed to create a realistic sense of touch, helped by constantly measuring skin deformation with strain sensors.
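As a rough illustration of what "closed-loop" means here (the actuator command is continuously corrected against the skin's own strain measurements), a toy proportional controller might look like the following Java sketch; the gain, the plant model, and every name are invented for illustration and are in no way EPFL's actual control code:

public class HapticLoop {
    // Proportional gain; the value is arbitrary for this toy example.
    static final double KP = 0.8;

    // Current actuator pressure command (arbitrary units).
    double pressure = 0.0;

    // One control step: compare measured strain to the target and
    // nudge the pressure command in the direction that reduces the error.
    double step(double targetStrain, double measuredStrain) {
        double error = targetStrain - measuredStrain;
        pressure += KP * error;
        return pressure;
    }

    public static void main(String[] args) {
        HapticLoop loop = new HapticLoop();
        double strain = 0.0; // what the embedded strain sensor would report
        for (int i = 0; i < 10; i++) {
            double cmd = loop.step(0.5, strain);
            // Toy first-order model of how the soft skin responds.
            strain += 0.3 * (cmd - strain);
            System.out.printf("step %d: pressure=%.3f strain=%.3f%n", i, cmd, strain);
        }
    }
}

Run it and the measured strain converges on the 0.5 target; the point of self-sensing is that this correction can happen without any external measurement equipment.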
It’s been almost three years since Apple introduced the Touch Bar to the world, and it’s still searching for a purpose. While it’s not any worse than the Function keys it replaced, Apple has really struggled to help it reach its potential and make it a truly excellent part of owning a MacBook Pro. Surely, there are some killer apps for the Touch Bar? We’ve rounded up four Touch Bar apps and tools that make it super useful. If you’re using an app and the developer hasn’t added any Touch Bar functionality, then the Touch Bar is useless. BetterTouchTool fixes that by putting you firmly in control.
Inspired by octopuses, researchers have developed a structure that senses, computes and responds without any centralized processing - creating a device that is not quite a robot and not quite a computer, but has characteristics of both. "We call this 'soft tactile logic,' and have developed a series of prototypes demonstrating its ability to make decisions at the material level - where the sensor is receiving input - rather than relying on a centralized, semiconductor-based logic system," says Michael Dickey, co-corresponding author of a paper on the work and Alcoa Professor of Chemical and Biomolecular Engineering at North Carolina State University. "Our approach was inspired by octopuses, which have a centralized brain, but also have significant neuronal structures throughout their arms. This raises the possibility that the arms can 'make decisions' based on sensory input, without direct instruction from the brain." The prototypes are made of pigmented silicone containing channels filled with a metal that is liquid at room temperature, effectively creating a squishy wire nervous system. Pressing or stretching the silicone deforms the liquid metal, which increases its electrical resistance and, as current passes through it, raises its temperature (for a given current, Joule heating scales with resistance).
The newly unveiled iPhone 11 and iPhone 11 Pro do not include the 3D Touch feature found in past Apple products. 3D Touch has officially been phased out of the new lineup and replaced with "Haptic Touch," a less complex feature. The change wasn't highlighted at Apple's event, but it is going into effect: Apple is discontinuing 3D Touch, which had been an iPhone feature since 2015. 3D Touch allowed for different levels of responsiveness based on how hard users pressed down on their iPhone screen — for instance, pressing hard on an app icon would open a dropdown menu, while gently tapping the app would open it. Here's a look at the life and death of 3D Touch, and how it stacks up to Haptic Touch on the iPhone 11 and iPhone 11 Pro.
What factors affect how human touch perceives softness, like the feel of pressing your fingertip against a marshmallow, a piece of clay or a rubber ball? By exploring this question in detail, a team of engineers and psychologists at the University of California San Diego discovered clever tricks to design materials that replicate different levels of perceived softness. The findings provide fundamental insights into designing tactile materials and haptic interfaces that can recreate realistic touch sensations, for applications such as electronic skin, prostheses and medical robotics. "In doing so, we are helping close the gap in understanding what it takes to recreate some aspects of touch," said Charles Dhong, who co-led the study as a postdoctoral fellow at UC San Diego and is now an assistant professor in biomedical engineering at the University of Delaware. Based on the results from their experiments, the researchers created equations that can calculate how soft or hard a material will feel based on its thickness, its Young's modulus (a measure of a material's stiffness), and its micropatterned areas. Stiffness alone, in other words, does not determine perceived softness: "It is a factor, but now we show that it's only one part of the equation."
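For context (this is textbook Hertzian contact mechanics, not the study's own model, which additionally accounts for thickness and micropatterning): when a rigid sphere of radius $R$ is pressed into an elastic half-space with force $F$, the radius $a$ of the resulting contact patch is

$$a = \left( \frac{3FR}{4E^{*}} \right)^{1/3},$$

where $E^{*}$ is the effective elastic modulus of the contact. A lower modulus thus gives a larger contact patch for the same force, one of the mechanical cues a fingertip can use to judge softness, which is why Young's modulus is a factor, but only one part of the equation.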
The ubiquitous virtual keyboards found on smartphones, tablets, and other touchscreen devices might someday be replaced by an invisible equivalent, if researchers at the Korea Advanced Institute of Science and Technology have their way. In a fascinating study published on the preprint server Arxiv.org this week (“I-Keyboard: Fully Imaginary Keyboard on Touch Devices Empowered by Deep Neural Decoder”), they propose a “fully imaginary” keyboard — the I-Keyboard — lacking a predefined layout, shape, or size, that taps AI to detect typing from any position at any angle. Notably, it doesn’t require calibration, and the researchers claim that most people manage to achieve 95.84% typing accuracy with it relative to a conventional virtual keyboard baseline. “Contemporary soft keyboards possess a few limitations. In fact, current soft keyboard techniques damage the usability of mobile devices in multiple ways other than the mobility,” wrote the coauthors, who point out that the lack of tactile feedback generally increases the rate of typos. “Mobile devices provide smaller displays than non-mobile devices in general and soft keyboards can fill up to 40% of displays.”
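The paper's decoder is a deep neural network; as a much simpler illustration of the underlying problem (mapping free-form touch coordinates to intended keys), here is a naive nearest-centroid baseline in Java. Every name and coordinate is invented for illustration and has nothing to do with the authors' actual model:

import java.util.Map;

public class NaiveTouchDecoder {
    // Per-user key centroids, e.g. estimated from observed typing.
    private final Map<Character, double[]> centroids;

    public NaiveTouchDecoder(Map<Character, double[]> centroids) {
        this.centroids = centroids;
    }

    // Decode one touch point to the key with the closest centroid.
    public char decode(double x, double y) {
        char best = '?';
        double bestDist = Double.MAX_VALUE;
        for (Map.Entry<Character, double[]> e : centroids.entrySet()) {
            double dx = x - e.getValue()[0];
            double dy = y - e.getValue()[1];
            double d = dx * dx + dy * dy;
            if (d < bestDist) {
                bestDist = d;
                best = e.getKey();
            }
        }
        return best;
    }

    public static void main(String[] args) {
        Map<Character, double[]> keys = Map.of(
                'a', new double[]{10, 50},
                's', new double[]{30, 50},
                'd', new double[]{50, 50});
        NaiveTouchDecoder decoder = new NaiveTouchDecoder(keys);
        System.out.println(decoder.decode(28, 52)); // nearest centroid: 's'
    }
}

The appeal of a learned decoder over a fixed rule like this is that it can also absorb hand drift, typing angle, and per-user variation, which is what lets the I-Keyboard dispense with a predefined layout and calibration.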