TECHNOLOGY
ICT WITHIN SPORTS AND PHYSICAL EDUCATION: COMBINING THE “TWITTER” AND “APP” PAGE 4
FACULTY CORNER
Google BigQuery vs Hadoop PAGE 6
NEW @ IT
TRENDING NEW TECHNOLOGIES PAGE 16
June-December 2013, the Bulletin of Information Technology
http://www.rajagiritech.ac.in
Department of Information Technology
Rajagiri School of Engineering & Technology
Rajagiri Valley, Kakkanad, Kochi
the BIT
KUTTYAMMA A J, PROFESSOR & HOD, DIT
FACULTY MOTIVATION - NEED OF THE DAY
Faculty is the most important factor in education. To make our institutions academically vibrant and socially justified, we need highly motivated, quality faculty.
Before discussing the various methods to motivate our faculty, the present scene must be analyzed. Dealing with the present younger generation, which is full of enthusiasm, emotion and eagerness, is a tough job. Moreover, they have access to a world of information technology which gives them a new world of knowledge, creativity and information. We are living in an era where changes and developments take place every minute, at a rapid pace. It is no longer easy for a teacher to gather information, process it, make it conducive and bear the responsibility of delivering it to students.
According to Maharshi Aurobindo, “the principle of true teaching is that nothing can be taught. The teacher is not an instructor or a taskmaster; he is a helper and a guide. His business is to suggest and not to impose. The chief aim of education should be to help the growing soul to draw out that in itself which is best and make it perfect for a noble use.”
The role of the teacher is to facilitate the student in acquiring the knowledge and information available in the world. A teacher must be a serious scholar, and should perform his role with a complete sense of belonging to the profession and to the pupil. He must also be imbued and enthused with a certain amount of austerity and renunciation, deep humanity and tolerance.
It has been observed that motivation is one of the foremost problems in education, and it is often inadequately addressed. There is a need for continuous training-cum-capacity-building programmes for teachers at all cadre levels. It is important to groom, nurture and develop teachers for imparting quality education at the higher level. A course can be designed for persons aspiring to become teachers in higher education. During the course, proper emphasis should be given to one's mental framework, teaching skills, communication skills, use of technology and, above all, motivation.
Grooming faculty through mentoring is one way of motivation. Visits to universities, higher education institutions and industries in India and abroad, and active dialogue with subject-related professionals and stakeholders, can also be motivating factors for the faculty. Rewards and compliments to faculty for their academic accomplishments are another way of motivating them. Involvement in institutional development activities, and giving importance to their suggestions, can also motivate faculty.
There are many more ways of motivating faculty. Some measures are person-specific and some are institution-specific. We have to find the proper way of motivating the faculty and treat this as a high-priority matter.
I conclude by quoting the words of Terrell H. Bell, former US Secretary of Education, “There
are three things to remember about education. The first one is motivation. The second one
is motivation. The third one is motivation.”
ON CREATIVE DESK
Editors
Prof. Kuttyamma A.J. (HOD- Department of Information Technology)
Jisha G, Assistant Professor
Student Editors
Lakshmi Ramesh S8 IT
Illustrations
Krishnadas Naduvath, Programmer
Photo Courtesy
Google Images
Contents
PAGE 4 INFORMATION COMMUNICATION TECHNOLOGY (ICT) WITHIN SPORTS
PAGE 6 GOOGLE BIGQUERY VS HADOOP
PAGE 7 WHAT IS BIG DATA?
PAGE 9 CHALLENGES AND OPPORTUNITIES WITH BIG DATA
PAGE 11 MOBILE SECURITY
PAGE 12 E-SAP
PAGE 13 SKINPUT TECHNOLOGY
PAGE 14 COMING SOON... A FULLY ENCRYPTED INTERNET
PAGE 15 TONGUE DRIVE SYSTEM TO OPERATE COMPUTERS
PAGE 16 EVENTS @ IT DEPARTMENT
INFORMATION COMMUNICATION TECHNOLOGY (ICT) WITHIN SPORTS AND PHYSICAL EDUCATION: COMBINING THE “TWITTER” AND “APP”
Rejeesh T. Chacko, Assistant Professor, Physical Education
Introduction
The use of information communication technology (ICT) in the modern world has helped the human race improve many things, and is claimed to improve thinking, communication and problem-solving skills through a wide range of software and input devices. ICT is rapidly updated and changes over time. The use of ICT in this century is an integral part of everyday life, whether communicating, accessing, uploading or downloading data across the vast spectrum of devices and platforms in use today. Education needs to respond to this technological demand, ensuring that what is being taught and learnt in institutions across the country remains relevant and engaging.

Sports and Physical Education have the potential not only to touch the lives of individuals but also to positively affect society as a whole, and they are an integral part of the total education process. The use of ICT within education can aid the development of physical education and sports practitioners who are competent and confident in using a vast range of digital media and technology to enhance sports and physical education within and beyond compulsory education. Recent advancements in ICT have allowed the use of mobile technology and micro-blogging technology within education to become a reality. Mobile apps and micro-blogging websites such as Twitter have the potential to really enhance the learning experience within education. This article outlines how this technology can be used to improve teaching and learning experiences and also to promote sports and physical education globally.

Mobile Apps and Micro-Blogging Technology
Learning is no longer confined to a particular fixed location; instead, as a result of mobile technology, access to educational content is available at any time, anywhere around the world. The use of apps and micro-blogging can enable active engagement and personalized learning driven by curiosity and interest. Creating ICT opportunities that encourage pupils to interact and engage with the learning content, both within and beyond the classroom, improves the impact and reach of education. The use of apps and micro-blogging can also promote independence, create effective assessment and stimulate learning outside classroom hours. A number of mobile apps that are ideal for promoting physical literacy within physical education are outlined below.

The Power of Micro-Blogging
Micro-blogging allows users to exchange small elements of content such as short sentences, individual images, videos
Faculty Corner
or links to external websites, documents and media. Micro blogs can reach a large or specific audience with relatively low administration, which makes information widely available from numerous sources around the world.

Twitter for Teachers
Twitter is an online social networking and micro-blogging service that enables its users to send and read text-based messages of up to 140 characters, known as “tweets”. There are a number of ways Twitter can help teachers. Twitter facilitates effective networking with other professionals, and enables the sharing of information, the discussion of topics to further knowledge, the sharing of best practices and much more.

Twitter for Students
Twitter can be used by students in a wide range of ways. Twitter can be used for news and information; students can create a micro blog to demonstrate knowledge or ask questions; and students can use Twitter for revision and discussion topics, to assist with homework and coursework, and to network with other students individually or in groups.

Assisting Apps
With the rise of mobile devices and app-based learning, there is now an enormous selection of affordable tools for PE teachers to choose from, such as apps for video analysis, replay, tagging, communication, assessment, health tracking and so forth. The ability to have all of these tools in one device that fits in your hand has transformed the ability to use technology on the fly. There has been a thousand-fold increase between 2011 and 2012 in Google searches for apps related to PE. This shows that the demand for technology integration is not limited to the traditional classroom and is expanding to various other subject areas. PE teachers have often been overlooked in the push to use more technology in the classroom, and it is great to see that more resources are opening up to them.
Here is a list of some of the apps for Sports and Physical Education:

1) Coach's Eye
This is an essential app for any PE teacher or coach who wants to capture what students are doing and guide them to improve. The app enables you to record a video and then do a voiceover together with screen annotations. Video analyses can easily be saved and shared, making them ideal for providing students and parents with feedback on how the child can improve. This app, although designed primarily for coaching, would also prove useful in any classroom.

2) Coach Note
Coach Note is a must-have app for coaches who want to go over plays with players. There are templates for multiple sports, and you can also add your own custom ones. With a variety of tools and the ability to record and share your coaching, the app is quite powerful. One suggestion for using this app is to have a portable projector so that you can coach small groups of students during a game or on the go.

3) Pocket Body
A fully searchable interactive atlas of the human body.

4) Cardiograph
Cardiograph is an application that measures your heart rate.

5) Giant Score Board
A giant scoreboard featuring a timer, countdown (editable), team names (editable) and giant digits; it fits any sport and is simple, accurate and reliable.

6) Educreations
Educreations turns your iPad or mobile into a recordable whiteboard, with diagrams, video playback, voice recording, realistic digital ink, photo imports and simple sharing through email, Facebook or Twitter.

7) Map My Run
Tracks and saves your running route; it can be given to students to promote independence whilst running, and can be used as a motivational tool outside and within lessons.

8) Sprint Timer
This is a great photo-finish app which utilizes the same technologies as professionals use to show the winners and their times. The app can be activated by the start gun or whistle and provides either a photo or video finish.
Summary
The use of apps and micro-blogging can aid teaching and learning experiences if used effectively. The use of micro-blogging and mobile apps can complement teaching strategies in creating innovative, effective and engaging learning experiences in Sports and Physical Education as well as in other disciplines.
Google BigQuery vs Hadoop
NIKHILA T BHUVAN, ASST. PROFESSOR, IT
Managing huge amounts of data and big-data processing has always been a problem. The open-source framework Hadoop has become increasingly popular over the past few years as a way for entities to crunch massive amounts of data stored on large hardware clusters. Developers and companies can deploy Hadoop on their own infrastructure, or run it via the cloud. The origin of Hadoop lies in whitepapers that Google released in 2003 and 2004, which introduced MapReduce and the Google File System (GFS). In 2006, Doug Cutting and Michael J. Cafarella used Google's whitepapers to create Hadoop based on MapReduce and HDFS. The Apache foundation adopted Hadoop and released it to the world. Hadoop has today developed into a really strong product with a massive ecosystem of tools.
However, when making the decision to use Hadoop, there does seem to be a bit of an elephant in the room in Google's Dremel and BigQuery products. Google markets these as a different and more comprehensive way to distribute, parse and analyze data than Hadoop. BigQuery is based on a product that Google invented a few years ago called Dremel, which is designed as a way to query extremely large datasets in real time. Sounds great, but why is the industry carrying on with Hadoop when Google has gone its own way?
There are some flaws in Hadoop's approach to distributed computing that need to be improved on. The biggest flaw is that Hadoop processes things in batch, not in real time. The nature and speed of the online world, where customer trend analysis or real-time recommendation engines need to analyze data and produce instant results, means that Hadoop's batch approach isn't going to be fast enough. Hadoop is troublesome when you need to process massive amounts of data and use the results in real time to affect business decisions.
Google's Dremel is a query service that allows you to run SQL-like queries against very, very large data sets and get accurate results in mere seconds. Only a basic knowledge of SQL is needed to query extremely large datasets in an ad hoc manner. BigQuery is the public implementation of Dremel that was recently launched to general availability. BigQuery is a web service that enables companies to analyze massive datasets of up to billions of rows in seconds using Google's vast data processing infrastructure. BigQuery is designed to finish most queries within seconds or tens of seconds and can even be used by non-programmers, whereas MapReduce takes much longer (at least minutes, and sometimes even hours or days) to finish processing a dataset query.
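The gap between the two models can be sketched in a few lines of plain Python. The toy word count below (no Hadoop involved; the data and function names are made up for illustration) mimics the map and reduce phases a Hadoop batch job walks through, while the trailing comment shows the kind of one-line SQL statement BigQuery would accept for the same question.

```python
from itertools import groupby
from operator import itemgetter

# Toy word count in the MapReduce style that Hadoop batch jobs follow.
def map_phase(lines):
    for line in lines:
        for word in line.split():
            yield (word, 1)  # emit intermediate (key, value) pairs

def reduce_phase(pairs):
    # Hadoop shuffles and sorts by key between the phases; sorted() stands in here.
    for word, group in groupby(sorted(pairs, key=itemgetter(0)), key=itemgetter(0)):
        yield (word, sum(count for _, count in group))

logs = ["hadoop batch hadoop", "batch job"]
print(dict(reduce_phase(map_phase(logs))))

# In BigQuery the same result is a single ad hoc query, along the lines of:
#   SELECT word, COUNT(*) FROM dataset.words GROUP BY word
```

The point of the contrast is operational: the batch pipeline must materialize and shuffle every intermediate pair before any answer appears, while Dremel-style execution returns the aggregate interactively.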
Google has also argued that using BigQuery instead of a Hadoop deployment will save users money, because they pay only for the queries that are processed, rather than for the computational costs of running individual Hadoop supporting components.
There is no use in blindly adopting Hadoop just because it is the flavor of the time. For those wanting to get involved in Big Data and to future-proof their skills, learning about Dremel and becoming a BigQuery expert might be a good investment for the next couple of years. If the industry wakes up to Dremel, which it should, then those candidates who invested their time to learn about it will find their skills in extreme demand.
What is Big Data?
FELIX XAVIER, TECHNICAL ASSISTANT
Small data is gone. Data is just going to get bigger and bigger and bigger, and people just have to think differently about how they manage it. —Scott Zucker, Family Dollar
Big data is a popular term used to describe the exponential growth and availability of data, both structured and unstructured. And big data may be as important to business, and society, as the Internet has become. Why? More data may lead to more accurate analyses. More accurate analyses may lead to more confident decision making. And better decisions can mean greater operational efficiencies, cost reductions and reduced risk.

As far back as 2001, industry analyst Doug Laney (currently with Gartner) articulated the now mainstream definition of big data as the three Vs: volume, velocity and variety:
• Volume. Many factors contribute to the increase in data volume. Transaction-based data stored through the years. Unstructured data streaming in from social media. Increasing amounts of sensor and machine-to-machine data being collected. In the past, excessive data volume was a storage issue. But with decreasing storage costs, other issues emerge, including how to determine relevance within large data volumes and how to use analytics to create value from relevant data.
• Velocity. Data is streaming in at unprecedented speed and must be dealt with in a timely manner. RFID tags, sensors and smart metering are driving the need to deal with torrents of data in near-real time. Reacting quickly enough to deal with data velocity is a challenge for most organizations.
• Variety. Data today comes in all types of formats. Structured, numeric data in traditional databases. Information created from line-of-business applications. Unstructured text documents, email, video, audio, stock ticker data and financial transactions. Managing, merging and governing different varieties of data is something many organizations still grapple with.
At SAS, we consider two additional dimensions when thinking about big data:
• Variability. In addition to the increasing velocities and varieties of data, data flows can be highly inconsistent, with periodic peaks. Is something trending in social media? Daily, seasonal and event-triggered peak data loads can be challenging to manage, even more so with unstructured data involved.
• Complexity. Today's data comes from multiple sources. And it is still an undertaking to link, match, cleanse and transform data across systems. However, it is necessary to connect and correlate relationships, hierarchies and multiple data linkages or your data can quickly spiral out of control.
t h e B I T, t h e B u l l e t t i n o f I n f o r m a t i o n Te c h n o l o g y 7
F a c u l t y C o r n e r
Now you can run hundreds and thousands of models at the product level – at the SKU level – because you have the big data and analytics to support those models at that level. —Kerem Tomak
Technologies
A number of recent technology advancements enable organizations to make the most of big data and big data analytics:
• Cheap, abundant storage.
• Faster processors.
• Affordable open source, distributed big data platforms, such as Hadoop.
• Parallel processing, clustering, MPP, virtualization, large grid environments, high connectivity and high throughputs.
• Cloud computing and other flexible resource allocation arrangements.
The goal of all organizations with access to large data collections should be to harness the most relevant data and use it for better decision making.
The Importance of Big Data and What You Can Accomplish
The real issue is not that you are acquiring large amounts of data. It's what you do with the data that counts. The hopeful vision is that organizations will be able to take data from any source, harness relevant data and analyze it to find answers that enable 1) cost reductions, 2) time reductions, 3) new product development and optimized offerings, and 4) smarter business decision making. For instance, by combining big data and high-powered analytics, it is possible to:
• Determine root causes of failures, issues and defects in near-real time, potentially saving billions of dollars annually.
• Optimize routes for many thousands of package delivery vehicles while they are on the road.
• Analyze millions of SKUs to determine prices that maximize profit and clear inventory.
• Generate retail coupons at the point of sale based on the customer's current and past purchases.
• Send tailored recommendations to mobile devices while customers are in the right area to take advantage of offers.
• Recalculate entire risk portfolios in minutes.
• Quickly identify customers who matter the most.
• Use clickstream analysis and data mining to detect fraudulent behavior.
Challenges
Many organizations are concerned that the amount of amassed data is becoming so large that it is difficult to find the most valuable pieces of information.
• What if your data volume gets so large and varied you don't know how to deal with it?
• Do you store all your data?
• Do you analyze it all?
• How can you find out which data points are really important?
• How can you use it to your best advantage?
Until recently, organizations have been limited to using subsets of their data, or they were constrained to simplistic analyses because the sheer volumes of data overwhelmed their processing platforms. But what is the point of collecting and storing terabytes of data if you can't analyze it in full context, or if you have to wait hours or days to get results? On the other hand, not all business questions are better answered by bigger data.
Challenges and Opportunities with Big Data
ARUN SOMAN, ASST. PROFESSOR, IT

WE ARE AWASH in a flood of data today. In a broad range of application areas, data is being collected at unprecedented scale. Decisions that previously were based on guesswork, or on painstakingly constructed models of reality, can now be made based on the data itself. Such Big Data analysis now drives nearly every aspect of our modern society, including mobile services, retail, manufacturing, financial services, life sciences, and physical sciences.

While the potential benefits of Big Data are real and significant, and some initial successes have already been achieved, there remain many technical challenges that must be addressed to fully realize this potential. The sheer size of the data, of course, is a major challenge, and is the one that is most easily recognized. But the challenges are not confined to Volume alone; they also lie in Variety and Velocity. By Variety is usually meant heterogeneity of data types, representation, and semantic interpretation. By Velocity is meant both the rate at which data arrive and the time in which they must be acted upon. While these three are important, this short list fails to include additional important requirements such as privacy and usability.

The analysis of Big Data involves multiple distinct phases, outlined below, each of which introduces challenges.

Phases in the Processing Pipeline

1. Data Acquisition and Recording
Most of the data is of no interest, and it can be filtered and compressed by orders of magnitude. One challenge is to define these filters in such a way that they do not discard useful information. We need research in the science of data reduction that can intelligently process the raw data to a size that its users can handle while not missing the needle in the haystack. Furthermore, we require “on-line” analysis techniques that can process such streaming data on the fly, since we cannot afford to store first and reduce afterward. The second big challenge is to automatically generate the right metadata to describe what data is recorded and how it is recorded and measured. Another important issue here is data provenance. Recording information about the data at its birth is not useful unless this information can be interpreted and carried along through the data analysis pipeline. Thus we need research both into generating suitable metadata and into data systems that carry the provenance of data and its metadata through data analysis pipelines.
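As a concrete, if toy, illustration of such a filter, the sketch below compresses a stream of sensor readings on the fly by keeping only values that deviate noticeably from a running mean. The data, threshold, and function name are invented for this example; a real system would need a far more careful definition of what counts as "useful" information.

```python
# Sketch of on-line data reduction: keep only readings that deviate
# from a running average, so the stream is compressed without having
# to store everything first. Threshold and data are illustrative.
def reduce_stream(readings, threshold=5.0):
    kept, mean, n = [], 0.0, 0
    for value in readings:
        if n == 0 or abs(value - mean) > threshold:
            kept.append(value)               # surprising reading: retain it
        mean = (mean * n + value) / (n + 1)  # update the running mean
        n += 1
    return kept

# A flat signal with one anomaly: only the first sample and the
# anomaly survive the filter.
print(reduce_stream([20.0, 20.1, 19.9, 35.0, 20.2]))
```

The risk the article points at is visible even here: a deviation just under the threshold is silently discarded, which is exactly the "needle in the haystack" a naive filter can lose.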
2. Information Extraction and Cleaning
Frequently, the information collected will not be in a format ready for analysis. We cannot leave the data in this form and still effectively analyze it. Rather, we require an information extraction process that pulls out the required information from the underlying sources and expresses it in a structured form suitable for analysis. Doing this correctly and completely is a continuing technical challenge. Also, we are used to thinking of Big Data as always telling us the truth, but this is actually far from reality.
3. Data Integration, Aggregation, and Representation
Data analysis is considerably more challenging than simply locating, identifying, understanding, and citing data. For effective large-scale analysis all of this has to happen in a completely automated manner. This requires differences in data structure and semantics to be expressed in forms that are computer understandable, and then “robotically” resolvable. Even for simpler analyses that depend on only one data set, there remains an important question of suitable database design. Usually, there will be many alternative ways in which to store the same information. We must enable other professionals, such as domain scientists, to create effective database designs, either through devising tools to assist them in the design process or through forgoing the design process completely and developing techniques so that databases can be used effectively in the absence of intelligent database design.
4. Query Processing, Data Modeling, and Analysis
Mining requires integrated, cleaned, trustworthy, and efficiently accessible data, declarative query and mining interfaces, scalable mining algorithms, and big-data computing environments. At the same time, data mining itself can also be used to help improve the quality and trustworthiness of the data, understand its semantics, and provide intelligent querying functions. Big Data is also enabling the next generation of interactive data analysis with real-time answers. Scaling complex query processing techniques to terabytes while enabling interactive response times is a major open research problem today. A problem with current Big Data analysis is the lack of coordination between database systems, which host the data and provide SQL querying, and analytics packages that perform various forms of non-SQL processing, such as data mining and statistical analyses. Today's analysts are impeded by a tedious process of exporting data from the database, performing a non-SQL process and bringing the data back.
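That export/process/re-import round trip can be made concrete with a small self-contained sketch using Python's built-in sqlite3 module and an in-memory database. The table and column names are made up for illustration; the shape of the workflow, not the schema, is the point.

```python
import sqlite3
import statistics

# A tiny stand-in database with some sales figures.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?)", [(10.0,), (12.0,), (50.0,)])

# 1. Export: SQL pulls the raw rows out of the database.
amounts = [row[0] for row in conn.execute("SELECT amount FROM sales")]

# 2. Non-SQL processing: the statistic is computed outside the database.
spread = statistics.stdev(amounts)

# 3. Re-import: the derived result is written back so it can be queried later.
conn.execute("CREATE TABLE stats (metric TEXT, value REAL)")
conn.execute("INSERT INTO stats VALUES ('stdev', ?)", (spread,))
print(round(spread, 2))
```

Each hop copies data across a system boundary; at terabyte scale those copies, not the statistic itself, dominate the analyst's time, which is exactly the coordination gap the text describes.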
5. Interpretation
Having the ability to analyze Big Data is of limited value if users cannot understand the analysis. Ultimately, a decision-maker, provided with the result of analysis, has to interpret these results. Hence, one must provide supplementary information that explains how each result was derived, and based upon precisely what inputs. Such supplementary information is called the provenance of the (result) data. By studying how best to capture, store, and query provenance, in conjunction with techniques to capture adequate metadata, we can create an infrastructure to provide users with the ability both to interpret analytical results obtained and to repeat the analysis with different assumptions, parameters, or data sets. The users need to be able to see not just the results, but also understand why they are seeing those results.
MOBILE SECURITY
NAJLA SIRAJ, S1S2 H
MOBILE DEVICES CAN be both the instruments and victims of privacy violations. Google's latest innovation, Google Glass, has been pre-emptively banned at a diner in Seattle due to the security implications of an unobtrusive mobile device capable of discreetly recording audio, video and still footage in public and private places. However, most security threats from mobile devices result from the manner in which the consumer uses the technology:
• Consumers who elect to set PINs and passwords for their mobile devices often choose easily deciphered codes, such as 1234 or 0000.
• Users may unknowingly download malware disguised as a useful application.
• Out-of-date operating systems may pose threats. OS manufacturers periodically release security patches and fixes, but it is up to the consumer to update their devices. Older devices may not support new updates to the OS.
• Out-of-date software presents similar security risks. Attackers can exploit vulnerabilities in outdated software.
• Wireless transmissions are not always encrypted, making information sent via mobile devices easier to intercept.
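The first point is easy to make concrete. The sketch below screens a chosen PIN against the kinds of easily deciphered codes mentioned above; the blocklist, the rules and the function name are all illustrative, not drawn from any real platform's policy.

```python
# Illustrative screening of a 4-digit PIN against weak choices:
# a small blocklist, repeated digits, and straight runs like 1234/9876.
WEAK_PINS = {"1234", "0000", "1111", "2580", "1212"}

def is_weak_pin(pin: str) -> bool:
    """Return True if the PIN is blocklisted, all one digit,
    or a strictly ascending/descending digit run."""
    if pin in WEAK_PINS or len(set(pin)) == 1:
        return True
    steps = {ord(b) - ord(a) for a, b in zip(pin, pin[1:])}
    return steps in ({1}, {-1})  # sequential run, e.g. 3456 or 9876

print(is_weak_pin("0000"))  # blocklisted and repeated
print(is_weak_pin("7294"))  # passes this simple screen
```

A check like this is only a first line of defense; rate-limiting failed attempts matters far more than the PIN's pattern.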
With users treating their devices in such a blasé fashion, it can be difficult and frustrating for IT specialists to help users avoid security and privacy mishaps, especially when those devices are used for company purposes.
Student Corner
e-SAP
SUSAN SHAJU, S3 IT
ELECTRONIC SOLUTIONS AGAINST Agricultural Pests (e-SAP), an innovative tablet device, promises to be of much help in the field of agriculture. It helps people get instant information about the kind of pest that's ruining their plants, with suggestions on how to fight it. The hand-held device is easy to use; even a layman can navigate it to find the problem in his crop. The device works on a web-based application system, which facilitates the flow of information from the grower to the farm scientist at the click of a button. It also has a voice-based application that guides the farmer in the local dialect. The tablet can also recommend solutions based on the geography and agro-climatic conditions of the area where the crop is being grown.

e-SAP will make the work of the extension service worker easy and enhance their efficiency, while at the same time providing farmers with solutions right in their fields in real time. This technology targets one of the critical requirements of a crop cycle: pest management. e-SAP has features that can bring the farmer, extension worker, scientist and policy maker onto the same plane, thereby helping to find solutions that are more practical, in less time.
e-SAP is unique in the sense that it has a voice-based application system, which guides the farmer and the extension worker in the local language about how to collect the data and the specimens. It also allows the extension worker and the farmer to do a survey of the pest attack or related problems right in the field, which is then automatically synthesised in the form of graphs and tables along with the decision support intelligence. Another highlight of e-SAP is the image-based model that captures high-quality images of pests and their symptoms and then guides the user in identifying the pest. This, coupled with audio assistance, makes it very easy to handle.

Just a click away: as soon as you notice a pest in your garden or farm, capture images of it using the device. e-SAP captures high-quality images of pests and their symptoms, and then guides the user on how to identify them. It also provides instant solutions on how to get rid of them.
Skinput Technology
Skinput is a technology that appropriates the human body for acoustic transmission, allowing the skin to be used as an input surface. In particular, we resolve the location of finger taps on the arm and hand by analyzing mechanical vibrations that propagate through the body. We collect these signals using a novel array of sensors worn as an armband. This approach provides an always-available, naturally portable, on-body finger input system. We assess the capabilities, accuracy and limitations of our technique through a two-part, twenty-participant user study.
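The core signal-processing idea, picking discrete tap events out of a continuous vibration signal, can be caricatured in a few lines. The sketch below is only a toy threshold detector over made-up amplitude samples; the real Skinput system uses an armband of tuned vibration sensors and machine-learning classification to also localize which part of the arm was tapped.

```python
# Toy sketch: flag finger-tap events as the samples where vibration
# amplitude first jumps above a threshold after a quiet period.
# Signal values and the threshold are invented for illustration.
def detect_taps(signal, threshold=0.5):
    taps, armed = [], True
    for i, amplitude in enumerate(signal):
        if armed and abs(amplitude) > threshold:
            taps.append(i)   # rising edge: register one tap
            armed = False    # ignore the rest of this vibration burst
        elif abs(amplitude) <= threshold:
            armed = True     # signal has settled; re-arm for the next tap
    return taps

# Two bursts in the stream -> two tap events, at their onset samples.
print(detect_taps([0.0, 0.1, 0.9, 0.8, 0.1, 0.0, 0.7, 0.2]))
```

The "armed" flag is what keeps one physical tap, which rings for several samples, from being counted repeatedly, a debouncing concern any real tap detector shares.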
Introduction of Skinput Technology
The primary goal of Skinput is to provide an
always available mobile input system - that
is, an input system that does not require a
user to carry or pick up a device. A number
of alternative approaches have been pro-
posed that operate in this space. Techniques
based on computer vision are popular. These,
however, are computationally expensive and
error prone in mobile scenarios (where, e.g.,
non-input optical flow is prevalent). Speech
input is a logical choice for always-available
input, but is limited in its precision in unpredict-
able acoustic environments, and suffers from
privacy and scalability issues in shared envi-
ronments. Other approaches have taken the
form of wearable computing.
This typically involves a physical input device
built in a form considered to be part of one’s
clothing. For example, glove-based input
systems allow users to retain most of their
natural hand movements, but are cumber-
some, uncomfortable, and disruptive to tactile
sensation. Post and Orth present a “smart
fabric” system that embeds sensors and conductors into fabric, but taking this approach to
always-available input necessitates embed-
ding technology in all clothing, which would
be prohibitively complex and expensive. The
SixthSense project proposes a mobile, always
available input/output capability by combin-
ing projected information with a color-marker-
based vision tracking system. This approach is
feasible, but suffers from serious occlusion and
accuracy limitations. For example, determining
whether a finger has tapped a button or is
merely hovering above it is extraordinarily
difficult.
Bio-Sensing:
Skinput leverages the natural acoustic conduc-
tion properties of the human body to provide
an input system, and is thus related to previ-
ous work in the use of biological signals for
computer input. Signals traditionally used for
diagnostic medicine, such as heart rate and
skin resistance, have been appropriated for
assessing a user’s emotional state. These fea-
tures are generally subconsciously driven and
cannot be controlled with sufficient precision
for direct input. Similarly, brain-sensing technologies such as electroencephalography (EEG)
and functional near-infrared spectroscopy (fNIR)
have been used by HCI researchers to assess
cognitive and emotional state; this work also
primarily looked at involuntary signals. In con-
trast, brain signals have been harnessed as
a direct input for use by paralyzed patients,
but direct brain computer interfaces (BCIs) still
lack the bandwidth required for everyday computing tasks, and require levels of focus, training, and concentration that are incompatible
with typical computer interaction.
There has been less work relating to the inter-
section of finger input and biological signals.
Researchers have harnessed the electri-
cal signals generated by muscle activa-
tion during normal hand movement through
electromyography (EMG). At present, however,
this approach typically requires expensive
amplification systems and the application of
conductive gel for effective signal acquisi-
tion, which would limit the acceptability of this
approach for most users. The input technol-
ogy most related to our own is that of Amento
who placed contact microphones on a user’s
wrist to assess finger movement. However,
this work was never formally evaluated, and
is constrained to finger motions of one hand.
The Hambone system employs a similar
setup, and through an HMM, yields classifica-
tion accuracies around 90% for four gestures
(e.g., raise heels, snap fingers). Performance
of false positive rejection remains untested in
both systems at present. Moreover, both tech-
niques required the placement of sensors near
the area of interaction (e.g., the wrist), increas-
ing the degree of invasiveness and visibility.
Finally, bone conduction microphones and
headphones - now common consumer tech-
nologies - represent an additional bio-sens-
ing technology that is relevant to the present
work. These leverage the fact that sound fre-
quencies relevant to human speech propagate
well through bone.
Bone conduction microphones are typically
worn near the ear, where they can sense vibra-
tions propagating from the mouth and larynx
during speech. Bone conduction headphones
send sound through the bones of the skull and
jaw directly to the inner ear, bypassing trans-
mission of sound through the air and outer ear,
leaving an unobstructed path for environmental sounds.
SKINPUT TECHNOLOGY
POORNIMA P, S7 IT
COMING SOON... A FULLY ENCRYPTED INTERNET
SREEJA SURESH SHENOY, S8-IT
There has been a torrent of major revelations from former
National Security Agency contractor Edward Snowden showing
how vulnerable we are to mass Internet surveillance. According
to Snowden, the NSA has found it easy to tap the Internet, and
in response the IETF (Internet Engineering Task Force) is
pondering how to strengthen it.
According to cryptographer Bruce Schneier, using encryption in
various parts of the existing Internet can go a long way in
preventing agencies like the NSA from snooping on data, making
it harder to spy. One simple step, for example, is for Web
companies to routinely use SSL, an encrypted communications
protocol between people's computers and company servers. The NSA
reportedly got ten times as much information from Yahoo users as
it did from Google users, and this was because "Google uses SSL
by default."
The IETF's engineers are now working to produce a revamped
version of the current system in which all Web traffic will be
encrypted. This is expected to be ready by the end of next year.
http, short for Hyper Text Transfer Protocol, is the underlying
protocol of the World Wide Web. It defines the information
exchanges between the web browser on your phone or computer and
the web server that holds the data of the website you are
visiting. Today much of the web traffic is unencrypted unless
https, a secure version of http, is used. https is commonly used
by banks, e-commerce sites, Google, Facebook, etc. The IETF would
like to make encryption a default part of http, creating the new
version http 2.0. In fact,
work on this is proceeding “frantically” according to Stephen
Farrell, a computer scientist at Trinity College in Dublin who
is part of the project. This new specification is expected to
be ready by the end of 2014, but it will be up to individual
websites to adopt it. According to experts, this would make it
much harder for agencies like the NSA to spy.
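The only difference between http and https is whether the TCP byte stream is wrapped in TLS (the successor to SSL). A minimal Python sketch of the client side of that wrapper, using only the standard library; no request is actually sent here, and the hostname would be supplied by the caller:

```python
import socket
import ssl

# A default client context verifies the server's certificate chain and
# checks that the certificate matches the hostname: the two checks that
# make the encryption meaningful against an eavesdropper.
context = ssl.create_default_context()

def open_https(host, port=443):
    """Open a TCP connection and wrap it in TLS (the https case).
    Plain http would use the raw socket with no wrap_socket step, so every
    byte (URLs, cookies, form data) would cross the network in the clear."""
    raw = socket.create_connection((host, port))
    return context.wrap_socket(raw, server_hostname=host)
```

What http 2.0 proposes, in essence, is that this wrapping step stops being optional for websites.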
Other steps taken by the IETF include stepping up security in
e-mail and instant-message traffic. Protocols to encrypt these
communications already exist, but the problem is that the
encryption is often not set up correctly. As a result, it does
not work between different mail servers, e.g. when a message
hops between a big encrypted service like Gmail and the server
of a small company or institution. When this happens, your
e-mail ends up being sent "in the clear", i.e. unencrypted,
because e-mail protocols give priority to actual delivery over
all other concerns, including whether or not the encryption was
actually working.
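The "delivery first" behaviour just described can be sketched as a policy. In real systems the upgrade step is SMTP's STARTTLS command (in Python, `smtplib.SMTP.starttls()`); this toy models only the decision, not the protocol:

```python
def deliver(message, peer_supports_tls):
    """Opportunistic encryption as e-mail practises it: upgrade to TLS when
    the receiving server offers it, otherwise send anyway, unencrypted,
    because delivery is prioritised over confidentiality."""
    if peer_supports_tls:
        return ("encrypted", message)
    return ("cleartext", message)   # the "in the clear" fallback
```

The IETF's fix is to tighten this policy so the cleartext branch stops being the silent default.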
But there are some challenges to adding encryption to http by
default. Part of what makes the task hard involves the static
portions of Web pages that are "cached", or stored on local
servers nearer to the user. Caching is a problem because cached
content sits, unencrypted, on servers between the browser and
the origin server; if all traffic is encrypted end to end,
caching becomes much harder. The engineers are working out the
trade-off between the security benefit and the caching benefit.
At the end of the day, it is both a good first step and a
promising reaction by those who build this wonderful resource
called the Internet.
The Tongue Drive System is a revolutionary new system that helps
individuals with disabilities control wheelchairs, computers
and other devices simply by using their tongue. The technology
will be helpful to individuals with serious disabilities, such
as severe spinal cord injuries, and will allow them to lead
more active and independent lives.
Individuals using a tongue-based system need only be able to
move their tongue, which is especially important when a person's
limbs are paralyzed. The Tongue Drive System can substitute for
a mouse in accessing a computer: users move the cursor on the
screen with tongue motions, and can also issue a single or
double click to select icons or open folders. A tiny magnet,
only the size of a grain of rice, is attached to the tongue by
implantation, piercing or adhesive, allowing a disabled person
to use the tongue to drive a computer mouse or a powered
wheelchair. The tongue was chosen to control the system because,
unlike the feet and hands, which are connected to the brain
through the spinal cord, the tongue has a direct connection to
the brain through a cranial nerve. Even when a person has a
severe spinal cord injury or other damage, the tongue usually
remains mobile enough to activate the system. Tongue movements
are also fast and accurate, and do not require much thinking,
concentration or effort.
The motions of the magnet attached to the tongue are detected
by an array of magnetic field sensors mounted on a headset worn
outside the mouth or on an orthodontic brace inside it. The
signals from the sensors are sent wirelessly to a portable
computer placed on the wheelchair or attached to the user's
clothing.
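The sensing chain just described, magnet motion picked up by a sensor array and turned into a command, can be sketched as follows. The four-sensor layout, the normalized magnitudes, and the rest threshold are illustrative assumptions, not the actual Tongue Drive design:

```python
# One command per sensor: moving the tongue toward a sensor raises that
# sensor's field magnitude (assumed layout, one sensor per direction).
SENSOR_COMMANDS = ["left", "right", "up", "down"]
REST_THRESHOLD = 0.5   # below this on every sensor, the tongue is at rest

def decode(readings):
    """Map one frame of sensor magnitudes to a cursor command, or None at rest."""
    strongest = max(range(len(readings)), key=lambda i: readings[i])
    if readings[strongest] < REST_THRESHOLD:
        return None
    return SENSOR_COMMANDS[strongest]
```

For example, a frame like `[0.9, 0.1, 0.2, 0.1]` would decode to "left", while a frame with no sensor above the threshold decodes to no command. The training step the next paragraph describes amounts to fitting these thresholds and mappings to each user's anatomy.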
The Tongue Drive System is trained to recognize a wide array of
tongue movements and to map specific movements to specific
commands, taking into account the user's oral anatomy, abilities
and lifestyle. The ability to train the system with as many
commands as an individual can comfortably remember is a
significant advantage over the common sip-and-puff device, which
acts as a simple switch controlled by sucking or blowing through
a straw. The Tongue Drive System is a touch-free, wireless and
non-invasive technology that requires no surgery to operate.
Several GUIs have been developed for the prototype Tongue Drive
System. One of them is a simple computer game called Fish Tales.
Accessible entertainment is even more important to the quality
of life of individuals with disabilities than it is to their
healthy counterparts. This experiment evaluates the usability
of the Tongue Drive System in enabling users to play computer
games that are normally controlled by keyboard and mouse. In
this GUI, players use their tongues to steer a red fish to
catch smaller fish while avoiding being caught by bigger fish.
Moving the tongue in a certain direction moves the mouse cursor
on the screen, making the red fish swim in that direction. The
further the cursor is moved from the fish's current location,
the faster the fish swims. The goal is to catch as many smaller
fish as possible; when the subject's fish eats enough smaller
fish, it grows and can eat larger fish.
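The cursor-to-fish mapping described above, where speed grows with the cursor's distance from the fish, is a simple proportional control. A sketch, with the gain value an illustrative assumption:

```python
def fish_velocity(cursor, fish, gain=0.2):
    """Velocity toward the cursor, growing with distance (proportional control).
    `cursor` and `fish` are (x, y) screen positions."""
    dx = cursor[0] - fish[0]
    dy = cursor[1] - fish[1]
    return (gain * dx, gain * dy)
```

So a cursor held far from the fish makes it dart across the screen, while a cursor hovering near it produces slow, fine movements.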
TONGUE DRIVE SYSTEM TO OPERATE COMPUTERS
MANJU V. J, S7 IT
Events @ IT Department
• Conducted a Workshop on the Internetworking Lab and Computer Aided Software Engineering Lab, July 5th to 6th, 2013, for faculty
members from engineering colleges affiliated to M.G University.
• Conducted a Workshop on "Deployment of Computer Clusters" on September 9th, 2013.
• Conducted a Workshop on Cloud Computing on the ANEKHA platform, with the Computer Science Department, on October 19th, 2013.
• Conducted a two-day faculty development program, "Enhanced Teaching Methods Using Moodle", by Binu A for RSET faculty
members on 10th and 16th December, 2013.