About the API:
Our Hand Recognition API is a powerful tool for detecting and tracking hands in images. The API accepts any decodable input image with a reasonable aspect ratio, processes it, and outputs detailed information about each hand it detects.
The output includes the coordinate frame of each hand, which provides the location and orientation of the hand in the image. Additionally, the API outputs the coordinates of 21 bone nodes for each hand. This detailed information can be used for a variety of applications such as virtual and augmented reality, human-computer interaction, and gesture recognition.
The API uses cutting-edge computer vision algorithms to analyze the image and detect hands with high accuracy. It is able to handle a wide range of lighting conditions, hand poses, and backgrounds, making it a versatile tool for any application that requires hand recognition.
The API can be easily integrated into your existing system, whether it's a mobile app, a website, or a standalone application. It is designed to be user-friendly, making it accessible to developers of all skill levels.
Overall, our Hand Recognition API is a powerful and versatile tool for detecting and tracking hands in images. With its detailed output and easy-to-use interface, it is the perfect solution for a wide range of applications that require hand recognition.
Pass the image URL of your choice and retrieve the hand information recognized in the picture.
Virtual and Augmented Reality: Use the API to track and interpret hand gestures, allowing for more natural and intuitive interaction with virtual environments.
Human-computer interaction: Use the API to enable hand gestures as input for controlling devices and applications, providing an alternative to traditional input methods such as mouse and keyboard.
Sign Language Recognition: Use the API to detect and interpret hand gestures in sign language, making communication more accessible for the deaf and hard of hearing.
Gaming: Use the API to track hand movements and interpret them as in-game actions, allowing for more immersive and interactive gameplay.
Robotics: Use the API to interpret hand gestures as commands for controlling robotic systems, allowing for more natural and intuitive human-robot interaction.
Medical research: Use the API to track and analyze hand movements in patients with conditions that affect motor skills, such as Parkinson's disease, to study and understand the progression of the disease.
Besides the monthly API call limit, there are no other limitations.
Pass the image URL of the hand from which you want to extract the information and coordinates.
Hand Recognition - Endpoint Features
| Object | Description |
|---|---|
| imageUrl | [Required] |
```json
{"code":0,"data":{"hand_info":[{"hand_parts":{"4":{"y":204,"x":486,"score":0.81871610879898},"10":{"y":321,"x":454,"score":0.81764525175095},"5":{"y":242,"x":422,"score":0.63888543844223},"11":{"y":359,"x":491,"score":0.79886507987976},"12":{"y":390,"x":523,"score":0.81205058097839},"7":{"y":321,"x":497,"score":0.83726966381073},"18":{"y":343,"x":391,"score":0.81639093160629},"13":{"y":305,"x":380,"score":0.67881578207016},"0":{"y":226,"x":263,"score":0.59736984968185},"8":{"y":353,"x":529,"score":0.8176703453064},"19":{"y":364,"x":422,"score":0.78116250038147},"9":{"y":274,"x":406,"score":0.72501480579376},"6":{"y":289,"x":470,"score":0.82305908203125},"16":{"y":396,"x":497,"score":0.85061377286911},"1":{"y":173,"x":327,"score":0.49955746531487},"3":{"y":194,"x":433,"score":0.7212952375412},"17":{"y":321,"x":353,"score":0.74342161417007},"2":{"y":167,"x":385,"score":0.66624820232391},"14":{"y":343,"x":428,"score":0.8819363117218},"15":{"y":369,"x":465,"score":0.86385977268219},"20":{"y":390,"x":454,"score":0.85869860649109}},"location":{"top":167,"height":229,"score":16.048545837402,"left":263,"width":266}}],"hand_num":1},"message":"success"}
```
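The response above can be parsed in a few lines of Python. This is a minimal sketch using a trimmed copy of the sample payload (only three of the 21 bone nodes are included here); the field names (`code`, `data`, `hand_info`, `hand_parts`, `location`, `hand_num`) come from the example response.

```python
import json

# Trimmed copy of the sample response (3 of the 21 bone nodes shown).
sample = json.loads("""
{"code": 0,
 "data": {"hand_info": [{"hand_parts": {
      "0": {"y": 226, "x": 263, "score": 0.597},
      "4": {"y": 204, "x": 486, "score": 0.819},
      "8": {"y": 353, "x": 529, "score": 0.818}},
    "location": {"top": 167, "height": 229, "score": 16.05, "left": 263, "width": 266}}],
  "hand_num": 1},
 "message": "success"}
""")

def summarize(response: dict) -> list[dict]:
    """Return one summary dict per detected hand: bounding box plus keypoints."""
    hands = []
    for hand in response["data"]["hand_info"]:
        loc = hand["location"]
        hands.append({
            "bounding_box": (loc["left"], loc["top"], loc["width"], loc["height"]),
            "keypoints": {int(i): (p["x"], p["y"]) for i, p in hand["hand_parts"].items()},
        })
    return hands

hands = summarize(sample)
print(sample["data"]["hand_num"], "hand(s) detected")
print(hands[0]["bounding_box"])  # (263, 167, 266, 229)
```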
```shell
curl --location --request POST \
  'https://zylalabs.com/api/1102/hand+recognition+api/960/hand+recognition?imageUrl=https://uploads-ssl.webflow.com/577065f4e06b550b0c190c5c/583bb3ca5b8693a10835b1f3_Sophie%27s%20hand_BEN7244.jpg' \
  --header 'Authorization: Bearer YOUR_API_KEY'
```
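The same call can be made from Python. A minimal sketch using only the standard library; `YOUR_API_KEY` is a placeholder for the key from your subscription, and the image URL in the comment is a hypothetical example.

```python
import json
import urllib.parse
import urllib.request

API_URL = "https://zylalabs.com/api/1102/hand+recognition+api/960/hand+recognition"

def build_request(image_url: str, api_key: str) -> urllib.request.Request:
    """Build the same POST request the curl example performs."""
    query = urllib.parse.urlencode({"imageUrl": image_url})
    return urllib.request.Request(
        f"{API_URL}?{query}",
        method="POST",
        headers={"Authorization": f"Bearer {api_key}"},
    )

def recognize_hands(image_url: str, api_key: str) -> dict:
    """Send the request and decode the JSON response body."""
    with urllib.request.urlopen(build_request(image_url, api_key), timeout=30) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Usage (requires a valid key and network access), e.g.:
#   result = recognize_hands("https://example.com/hand.jpg", "YOUR_API_KEY")
#   print(result["data"]["hand_num"])
```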
| Header | Description |
|---|---|
| Authorization | [Required] Should be Bearer access_key. See "Your API Access Key" above when you are subscribed. |
No long-term commitment. Upgrade, downgrade, or cancel anytime. Free Trial includes up to 50 requests.
The Hand Recognition API returns detailed information about detected hands in images, including the coordinate frame for each hand and the coordinates of 21 bone nodes, which represent key points on the hand.
The key fields in the response data include "hand_info," which contains an array of detected hands, and "hand_parts," which provides the coordinates (x, y) and confidence scores for each of the 21 bone nodes.
The response data is structured as a JSON object. It includes a "code" indicating the status of the request and a "data" object containing "hand_info," which lists the detected hands and their corresponding bone node coordinates.
The API provides information on hand detection, including the location and orientation of each hand, as well as detailed coordinates for 21 specific points on the hand, useful for applications like gesture recognition and virtual interaction.
Users can customize their requests by providing different image URLs to the POST Hand Recognition endpoint. The API processes the specified image and returns hand detection data based on the content of that image.
Typical use cases include virtual and augmented reality applications for gesture tracking, human-computer interaction for alternative input methods, gaming for immersive experiences, and medical research for analyzing hand movements in patients.
The Hand Recognition API employs advanced computer vision algorithms that are designed to handle various lighting conditions and hand poses, ensuring high accuracy in hand detection and tracking across diverse scenarios.
Users can expect a consistent JSON structure with a "code" field and a "data" object. Each detected hand will have a corresponding "hand_parts" object containing coordinates and scores, indicating the reliability of each detected point.
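Since each bone node carries its own confidence score, a common pattern is to discard low-confidence points before using them. A minimal sketch; the 0.5 threshold is an arbitrary choice for illustration, not an API recommendation, and the two nodes are taken from the sample response above.

```python
def filter_keypoints(hand_parts: dict, min_score: float = 0.5) -> dict:
    """Keep only bone nodes whose detection score meets the threshold."""
    return {
        node: (part["x"], part["y"])
        for node, part in hand_parts.items()
        if part["score"] >= min_score
    }

# Two nodes from the sample response: node 1 falls just below 0.5.
hand_parts = {
    "1": {"y": 173, "x": 327, "score": 0.4996},
    "4": {"y": 204, "x": 486, "score": 0.8187},
}
print(filter_keypoints(hand_parts))  # {'4': (486, 204)}
```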
To obtain your API key, you first need to sign in to your account and subscribe to the API you want to use. Once subscribed, go to your Profile, open the Subscription section, and select the specific API. Your API key will be available there and can be used to authenticate your requests.
You can’t switch APIs during the free trial. If you subscribe to a different API, your trial will end and the new subscription will start as a paid plan.
If you don’t cancel before the 7th day, your free trial will end automatically and your subscription will switch to a paid plan under the same plan you originally subscribed to, meaning you will be charged and gain access to the API calls included in that plan.
The free trial ends when you reach 50 API requests or after 7 days, whichever comes first.
No, the free trial is available only once, so we recommend using it on the API that interests you the most. Most of our APIs offer a free trial, but some may not include this option.
Yes, we offer a 7-day free trial that allows you to make up to 50 API calls at no cost, so you can test our APIs without any commitment.
Zyla API Hub is like a big store for APIs, where you can find thousands of them all in one place. We also offer dedicated support and real-time monitoring of all APIs. Once you sign up, you can pick and choose which APIs you want to use. Just remember, each API needs its own subscription. But if you subscribe to multiple ones, you'll use the same key for all of them, making things easier for you.
Please have a look at our Refund Policy: https://zylalabs.com/terms#refund
Service Level:
100%
Response Time:
2,610ms