Thursday, 3 November 2016

Bye, Bye, Blogger. . . .

08:01 Posted by Anonymous 4 comments



“Change is good!” they say.

Over the past few months, we’ve worked hard to share a number of articles with you through this page. This is where sections such as Geek Speak, Tech Mania and Kit Creatives have appeared, and where some of the best achievements of our students and our community have been celebrated.

But it’s time we moved on and reached a wider audience. Things have changed, times have changed. We feel that integrating this page with our own website will give our readers all the content they need in one, easier-to-reach place.

The new blog on Kidobotikz’s own website will give our readers all the content they need on a single page. From there, they can access a host of new features and stay in touch with the activities of our vibrant community.

So, as of today, we are ending our association with the Blogger page. All the old articles will remain here, though, so you can relive your memories while we move them over to the new blog.

While the page may change, what hasn't changed is our approach. You can expect the same kind of articles and the same standard of writing that you’ve grown used to.

From tomorrow, you can read about us and the Kidobotikz community at kidobotikz.com/blog.

As always, if you have feedback, we'd love to hear it. Please email us at info@kidobotikz.com.

Thank you for supporting what we do.

Wednesday, 2 November 2016

Robotics learning in the age of Open Innovation

10:33 Posted by Anonymous 7 comments

If you have been part of a multinational conglomerate with several business verticals, the latest fad you can probably relate to is the term “Open Innovation”. Open innovation as a concept is fairly new, and its adoption is yet to take off on a massive scale. Much of this can be blamed on the mindset of big institutions, which are risk-averse and tend to lock up their innovation with patents.

While safeguarding ideas with patents may have been lucrative, given the royalties it has earned the industry until now, it has had a significant impact on the innovation culture within the ecosystem.

Innovation is no longer disruptive. At best, it can be termed incremental and gradual. This means the innovation engine has run dry across most sectors and needs to be restarted.

While the concept of open innovation has been promoted by big players and small players alike, there has been quite a struggle in getting it moving. This can be attributed to a general climate in which innovation and creativity are no longer words of importance outside R&D and product development. The average employee, who may well be part of a testing team, has no medium through which to express or explore their creative and technical prowess.

While industries do encourage employees who think outside the box or go the extra mile in bringing uniqueness to their products, more often than not an employee’s thought processes are locked up in the rigours of routine.

What is of paramount importance is the discovery of an activity or curriculum that not only promotes creative thinking, but also helps people think along lines that bring an element of futurism to their ideas.

This can be made possible only if technologies of the future are embraced and employees are made “early adopters”.

With the advent of paradigms such as Industry 4.0, the Internet of Things (IoT) and cyber-physical systems, the industry, and employees in particular, need to acquaint themselves with the nitty-gritty of these technologies. Also, with the “Hardware Revolution” underway, more computing is being extracted from non-conventional platforms, with devices such as 3D printers, SBCs (Single Board Computers) and microcontrollers coming to the fore.

That being the case, a knowledge of robotics and of the subjects under its gamut (electronics, mechanics, algorithms, programming) gives employees the right skill set to think across the spectrum and deliver innovative solutions to a wide range of problems. More importantly, it removes the need to look for solutions from elsewhere. With a climate that embraces ideas within the system, solutions can be found within the industry itself. Even where solutions already exist elsewhere, employees equipped with the necessary skills will be able to cross-pollinate them to meet the requirements of their firms.

Happy Roboting!

New Horizons completes marathon Pluto data transfer

08:54 Posted by Anonymous No comments

NASA's Deep Space Network (DSN) has received the final piece of data collected by the agency's New Horizons probe during its encounter with the dwarf planet Pluto, which took place on July 14, 2015. Data from the New Horizons mission has revolutionized our understanding of Pluto, revealing the planetoid to be a surprisingly dynamic and active member of our solar system.

Unlike NASA's Dawn mission, which has now spent over a year exploring the dwarf planet Ceres, New Horizons never made orbit around its target. Instead, the probe had only a brief window in which to harvest as much information as possible, before barreling past Pluto into the outer reaches of our solar system.

During the pass, New Horizons' meager power supply of only 200 watts ran a sophisticated suite of seven scientific instruments, which worked to collect over 50 gigabits of data. This information was safely stored in two solid-state digital recorders that form part of the probe's command and data-handling system.

Sending this data back to Earth would prove to be a lesson in patience. New Horizons began transmitting the stored data in September 2015 at a rate of around seven megabits per hour, with the information received back on Earth by the DSN.

The transmission was not continuous. The natural spin of the Earth meant New Horizons had to transmit data in eight-hour stints when the DSN was available, resulting in a transfer rate of around 173 megabits per day.

Furthermore, the New Horizons team had to share the capabilities of the DSN with other exploration endeavors such as the Dawn mission, further slowing the data transfer.
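
To put those figures in perspective, here is a rough back-of-the-envelope estimate of how long the downlink had to take, using only the approximate numbers quoted above (50 gigabits of data at roughly 173 megabits per day); the DSN sharing just mentioned stretched the real timeline even further.

```python
# Back-of-the-envelope estimate of the New Horizons downlink time,
# using only the approximate figures quoted in the article above.

total_data_gigabits = 50    # data collected during the Pluto flyby
daily_rate_megabits = 173   # effective transfer rate per day

total_megabits = total_data_gigabits * 1000   # 1 gigabit = 1000 megabits
days_needed = total_megabits / daily_rate_megabits

print(f"Minimum transfer time: ~{days_needed:.0f} days (~{days_needed / 365:.1f} years)")
# Roughly 289 days at best; sharing the DSN with other missions pushed the
# actual transfer past a year, consistent with the October 2016 finish.
```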

However, slow and steady wins the race, and at 5:48 a.m. EDT on the 25th of October, over a year after beginning the process, the DSN station located in Canberra, Australia, relayed the final piece of Pluto data from New Horizons to the probe's mission operations center at the Johns Hopkins Applied Physics Laboratory in Maryland.

The last data transmission contained part of an observation sequence of Pluto and its large moon Charon, as captured by the spacecraft's Ralph/LEISA imager.

"The Pluto system data that New Horizons collected has amazed us over and over again with the beauty and complexity of Pluto and its system of moons," comments Alan Stern, New Horizons principal investigator from Southwest Research Institute in Boulder, Colorado. "There's a great deal of work ahead for us to understand the 400-plus scientific observations that have all been sent to Earth. And that's exactly what we're going to do –after all, who knows when the next data from a spacecraft visiting Pluto will be sent?"

Source: NASA / New Atlas

Tuesday, 1 November 2016

Education in 2020: What will it look like?

09:47 Posted by Anonymous 1 comment
When we think about the year 2020, we usually imagine a time somewhere in the future where flying cars are the norm and contact with alien civilizations has been established. Followers of our former president Shri Abdul Kalam would probably think of the vision he portrayed through the book “Vision 2020”.

Of course, in our eternal love for procrastination, most of us forget that 2020 is not the far future. It is very much the near future – just four years away!

With the promising year so close, let’s examine the possibilities that could exist in the third decade of the 21st century, particularly in the field of education.

Well, we’re no soothsayers or oracles, so we can’t predict the state of education in the 2020s. But what we can do is examine the current trends and plot the graph out to the year 2020.

So, what indeed are the trends in the education sector vis-a-vis 2016?

Mobile Learning: Mobile learning, or mLearning, is on the rise. It’s estimated that the mobile learning industry alone will grow to over $37 billion by 2020. Obviously, this is an eLearning trend people just can’t get enough of.


Gamification: Gamification of the course curriculum promotes knowledge retention while also providing a degree of entertainment, bringing elements of fun and relaxation to any type of online training. The gamification market could reach about $2.8 billion in 2016.

Video-Based Training: eLearning designers made a wise move when they began to incorporate video into online learning. Video-based training is becoming so popular that about 98 percent of all organizations are expected to include video in their digital learning strategies in 2016.

Big Data: In eLearning, Big Data refers to the data generated by learners as they interact with the learning content as part of their course. This data is collected through Learning Management Systems (LMS).

Now that we have such interesting trends on hand, can we actually get to the task of predicting how the education scenario will look in the future?

A wild guess couldn’t hurt our chances.

Prediction: Individualised training will be the norm, as opposed to classroom-based learning. Courses will run at a pace optimised to match each student’s learning speed. While tutors will continue to play a role in the learning process, it will be limited to supervising it; the knowledge itself will not be imparted by them but will likely be digitised and taught through interactive video content. Tutors will instead focus on taking a candidate’s knowledge forward and directing them towards their areas of strength. Performance and strengths will be determined using metrics based not on memorisation of concepts but on their application. More importantly, technologies such as Big Data will help evaluate students’ learning attributes and generate reports that help tutors understand their students better.

NTU Singapore automates spray painting with PictoBot

06:03 Posted by Anonymous 25 comments

A new NTU robot will soon be spray-painting the interiors of industrial buildings in Singapore, saving time and manpower while improving safety. Known as PictoBot, the robot was invented by scientists from Nanyang Technological University (NTU Singapore) and co-developed with JTC Corporation (JTC) and local start-up Aitech Robotics and Automation (Aitech). PictoBot can paint a high interior wall 25 per cent faster than a crew of two painters, improving both productivity and safety. Industrial buildings are designed with high ceilings to accommodate bulky industrial equipment and materials. Currently, painting their interiors requires at least two painters using a scissor lift, and working at such heights exposes the painters to various safety risks.

In comparison, PictoBot needs only one human supervisor, as it can automatically scan its environment using its optical camera and laser scanner to navigate and paint walls up to 10 metres high with its robotic arm. It can work for four hours on one battery charge, giving walls an even coat of paint that matches industry standards. Equipped with advanced sensors, PictoBot can also operate in the dark, enabling 24-hour continuous painting. Developed in a year at NTU's Robotic Research Centre, PictoBot is supported by the National Research Foundation (NRF) Singapore, under its Test-Bedding and Demonstration of Innovative Research funding initiative.

The initiative provides funding to facilitate the public sector's development and use of technologies that have the potential to improve service delivery. This is done through Government-led demand projects, where agencies use research findings to address a capability gap and quickly deploy the new technology upon successful demonstration. 

Painting large industrial spaces is repetitive, labour intensive and time-consuming. PictoBot can paint while a supervisor focuses on operating it. The autonomous behaviour also means that a single operator can handle multiple robots and refill their paint reservoirs.

According to its developers, using PictoBot to automate spray painting helps mitigate the risks of working at height when painting the high walls typically found in industrial buildings. In addition, it helps reduce labour-intensive work, thus improving productivity and ensuring the quality of interior finishes. PictoBot is an example of how autonomous robots can be deployed to boost productivity and overcome the manpower constraints that Singapore faces in the construction industry.

Source: Phys.org 

Sunday, 30 October 2016

Thumb-steered drone leaves you with a free hand

23:00 Posted by Anonymous 2 comments
The shapes and sizes of drones have changed a lot in recent times, but most serious quadcopters are still controlled by way of a dual-joystick controller (autonomous flyers notwithstanding). A new crowdfunding campaign is coming at it from a different angle, by developing a drone that can be flown with a single hand using a stick and thumb ring.

It is true that getting up to speed as a drone pilot using joystick controllers can take some time. The team behind Shift is aiming to give novices an easier way to earn their wings through what it claims is a more intuitive way to fly.

The drone itself has a pretty standard, if respectable, list of specs: it carries a 4K camera that shoots 13-megapixel stills, packs 8 GB of onboard memory and even claims 30 minutes of flight time.


But the controller is something we haven't seen before. It is basically a short, fat stick that you hold in one hand, sliding your thumb through a ring on top. By moving your thumb around, you guide the drone through the air: push left and the drone flies left, push forward and it flies forward, and push up to have it increase its altitude. There is also a separate toggle on the front that can be used to change the drone's orientation with your index finger.
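
To make that control scheme concrete, here is a minimal, purely illustrative Python sketch of how a thumb-ring's displacement could be mapped to flight commands. The function and parameter names are hypothetical and are not Shift's actual interface.

```python
# Hypothetical sketch of mapping a thumb-ring's displacement to drone
# commands, as described above: left/right -> roll, forward/back -> pitch,
# up/down -> altitude, and a separate index-finger toggle -> yaw.
# Names and scaling are illustrative only, not Shift's real API.

from dataclasses import dataclass

@dataclass
class FlightCommand:
    roll: float      # -1.0 (left) .. 1.0 (right)
    pitch: float     # -1.0 (back) .. 1.0 (forward)
    throttle: float  # -1.0 (descend) .. 1.0 (climb)
    yaw: float       # -1.0 .. 1.0, from the front toggle

def thumb_to_command(dx: float, dy: float, dz: float, toggle: float,
                     deadzone: float = 0.05) -> FlightCommand:
    """Convert normalised thumb displacement (-1..1 per axis) into a command."""
    def shape(v: float) -> float:
        # Ignore tiny movements, clamp the rest to the valid range.
        return 0.0 if abs(v) < deadzone else max(-1.0, min(1.0, v))
    return FlightCommand(roll=shape(dx), pitch=shape(dy),
                         throttle=shape(dz), yaw=shape(toggle))

# Example: thumb pushed gently forward and slightly up.
print(thumb_to_command(dx=0.0, dy=0.4, dz=0.2, toggle=0.0))
```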

This does sound like a simpler way to control a drone, but we'd be interested to see how well it works in practice. Our encounters with drones controlled via smartphones and watches quickly reminded us how reliable joysticks are once you get a handle on them, and perhaps there is a reason they have remained the controllers of choice for so long.

We'd also question the value of being able to fly one-handed, which is billed as Shift's big advantage. Piloting a drone can take some concentration, so trying to multitask and make phone calls or take a sip of coffee at the same time might just be a recipe for a busted aircraft.

The Shift controller is said to be compatible with some existing drones, including models from Syma and WLtoys, and can be pre-ordered with a camera-less mini-drone. The company hopes to ship in May 2017 if all goes to plan.

Source: New Atlas / Shift

Saturday, 29 October 2016

Wiring the brain with artificial senses and limb control

23:00 Posted by Anonymous No comments
There have been significant advances in developing new prostheses with a simple sense of touch, but researchers are looking to go further. Scientists and engineers are working on a way to provide prosthetic users and those suffering from spinal cord injuries with the ability to both feel and control their limbs or robotic replacements by means of directly stimulating the cortex of the brain.

For decades, a major goal of neuroscientists has been to develop new technologies to create more advanced prostheses or to help people who have suffered spinal cord injuries regain the use of their limbs. Part of this has involved creating a means of sending brain signals to disconnected nerves in damaged limbs or to robotic prostheses, so they can be moved by thought and control feels simple and natural.

However, all this has had only limited application because, as well as being able to tell a robotic or natural limb to move, a sense of touch is also required, so the patient knows whether something has been grasped properly or whether the hand or arm is in the right position. Without this feedback, it is very difficult to control an artificial limb properly, even with constant concentration or computer assistance.

Bioengineers, computer scientists, and medical researchers from the University of Washington's (UW) GRIDLab and the National Science Foundation Center for Sensorimotor Neural Engineering (CSNE) are looking to develop electronics that allow for two-way communication between parts of the nervous system.

The bi-directional brain-computer interface system sends motor signals to the limb or prosthesis and returns sensory feedback through direct stimulation of the cerebral cortex – something that the researchers say they've done for the first time with a conscious patient carrying out a task.

In developing the new system, the researchers recruited volunteer patients who were being treated for a severe form of epilepsy through brain surgery. As a precursor to this surgery, the patients were fitted with a set of electrocorticographic (ECoG) electrodes. These were implanted on the surface of the brain to provide a pre-operative evaluation of the patient's condition and to stimulate areas of the brain to speed rehabilitation afterwards.

According to the UW, this allowed for stronger signals to be received than if the electrodes were placed on the scalp, but wasn't as invasive as when the electrodes are put into the brain tissue itself.

While the electrode grid was still installed, the patients were fitted with a glove equipped with sensors that could track the position of their hand, use different electrical current strengths to indicate that position, and stimulate their brain through the ECoG electrodes.

The patients then used those artificial signals delivered to the brain to "sense" how to move their hand under direction from the researchers. However, this isn't a plug-and-play situation. The sensation is very unnatural and is a bit like artificial vision experiments of the 1970s where blind patients were given "sight" by means of a device that covered their back and formed geometric patterns. It worked in a simple way, but it was like learning another language.

"The question is: Can humans use novel electrical sensations that they've never felt before, perceive them at different levels and use this to do a task?," says UW bioengineering doctoral student James Wu. "And the answer seems to be yes. Whether this type of sensation can be as diverse as the textures and feelings that we can sense tactilely is an open question."

For the test, three patients were asked to move their hand into a target position using only the sensory feedback from the glove. If they opened their fingers too far off the mark, no stimulation would occur, but as they closed their hand, the stimulus would begin and increase in intensity. As a control, these feedback sessions were interspersed with others where random signals were sent.
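
As a rough illustration of that feedback rule, here is a minimal Python sketch under the assumption of a simple linear mapping: no stimulation while the hand is open beyond a threshold, with intensity rising as the hand closes. The thresholds, scaling and names are hypothetical, not the study's actual parameters.

```python
# Minimal sketch of the feedback rule described above: no stimulation while
# the hand is open past a threshold, then stimulation intensity increasing
# as the hand closes toward the target posture. All values are hypothetical.

def stimulation_intensity(hand_aperture: float,
                          open_threshold: float = 0.7,
                          max_intensity: float = 1.0) -> float:
    """Map hand aperture (0 = fully closed, 1 = fully open) to a normalised
    stimulation intensity delivered via the ECoG electrodes."""
    if hand_aperture >= open_threshold:
        return 0.0  # fingers too far open: no stimulation
    # Intensity rises linearly as the hand closes below the threshold.
    return max_intensity * (open_threshold - hand_aperture) / open_threshold

for aperture in (0.9, 0.7, 0.5, 0.2, 0.0):
    print(f"aperture={aperture:.1f} -> intensity={stimulation_intensity(aperture):.2f}")
```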

According to the team, the hope is that one day such artificial feedback devices could lead to improved prostheses, neural implants, and other techniques to provide sensation and movement to artificial or damaged limbs.

"Right now we're using very primitive kinds of codes where we're changing only frequency or intensity of the stimulation, but eventually it might be more like a symphony," says Rajesh Rao, CSNE director. "That's what you'd need to do to have a very natural grip for tasks such as preparing a dish in the kitchen. When you want to pick up the salt shaker and all your ingredients, you need to exert just the right amount of pressure. Any day-to-day task like opening a cupboard or lifting a plate or breaking an egg requires this complex sensory feedback."

The research will be published in IEEE Transactions on Haptics.