Daily Archives

May 8, 2017

The following are the top 25 basic terms every computer student should know before even beginning their studies:

1. Bit: Binary data storage unit valued at either 1 or 0.

2. Byte: Eight data bits valued between zero and 255.

3. Word: Two data bytes, or 16 data bits, valued between zero and 65,535 (see the quick sketch after this list).

4. CD-ROM: A storage disc with approximately 650 megabytes of capacity.

5. CD-ROM Drive: Hardware used for reading CD-ROMs (writing requires a recordable disc, such as a CD-R, and a drive that can burn it).

6. Storage Media: Devices, such as magnetic disks, optical discs, and flash memory, that store computer data persistently.

7. File: Permanent storage structure for data kept on a hard drive or other permanent place.

8. Virus: Unauthorized, self-replicating programs that infect files or send copies of themselves via email.

9. Vulnerability: A weakness in software through which unauthorized access can be gained.

10. Security Flaw: A software bug that lets attackers gain unauthorized system access.

11. Worm: Unwanted programs that spread between computers on their own by exploiting application or system vulnerabilities.

12. Hardware: Physical parts of computer (case, disk drive, monitor, microprocessor, etc.).

13. Software: Programs that run on a computer system.

14. Firmware: Software that has been permanently written into a computer’s hardware, such as into ROM or flash memory.

15. ISP: Internet Service Provider.

16. BIOS: The basic input/output system computers use to interface with devices.

17. MIME: Multipurpose Internet Mail Extensions.

18. Boot: The start-up process a computer goes through when it is turned on and begins to run.

19. Crash: When computer software errors occur and programs fail to respond.

20. Driver: Program that lets the operating system communicate with attached devices such as printers and video cards.

21. Network: Cables and other electrical components carrying data between computers.

22. Operating System: A computer’s core software component.

23. Parallel: Sending data over more than one line simultaneously.

24. Serial: Sending data over a single line one bit at a time.

25. Protocols: Agreed-upon rules and methods that govern communication on the Internet and other networks.
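As a quick illustration of the bit, byte, and word ranges in terms 1 through 3, here is a minimal Python sketch (the variable names are only for illustration):

```python
# An n-bit unit can hold 2**n distinct values, from 0 up to 2**n - 1.
bit_max = 2**1 - 1    # 1     -> a bit is either 0 or 1
byte_max = 2**8 - 1   # 255   -> a byte (8 bits) ranges from 0 to 255
word_max = 2**16 - 1  # 65535 -> a 16-bit word ranges from 0 to 65,535

print(bit_max, byte_max, word_max)  # 1 255 65535
```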

These are the top 25 terms all computer students should know before they even begin their technical training. Most computer students know much more. In fact, everyone who uses a computer these days should understand these terms so they can be better informed about the important tool that is so integral to our daily lives.

Source by Erik R Johnson


A signal is a piece of information in binary or digital form. Digital Signal Processing techniques improve signal quality or extract important information by removing unwanted parts of the signal.
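To make “removing unwanted parts of the signal” concrete, here is a minimal Python sketch of one of the simplest DSP techniques, a moving-average filter; the signal values are made-up illustrative data, not from the original text:

```python
def moving_average(signal, window=3):
    """Smooth a digital signal with a causal moving-average filter.

    Each output sample is the average of the current input sample and
    up to window-1 preceding samples. This attenuates rapid, unwanted
    fluctuations (noise) while preserving the slower trend.
    """
    out = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)               # start of the averaging window
        out.append(sum(signal[lo:i + 1]) / (i + 1 - lo))
    return out

noisy = [1.0, 1.2, 0.9, 5.0, 1.1, 1.0, 0.8]       # a spike of noise at index 3
print(moving_average(noisy))                      # the spike is spread out and damped
```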

The introduction of general-purpose microprocessors in the late 1970s and early 1980s made it possible for DSP techniques to be used in a much wider range of applications. However, general-purpose microprocessors such as the Intel x86 family were not well suited to the numerically intensive requirements of digital signal processing, and the growing importance of DSP led major electronics manufacturers to develop dedicated DSP chips whose designs met those requirements.

A DSP is a programmable chip capable of carrying out millions of floating-point operations per second. Typical DSP application fields are audio signal processing, video signal processing, image processing, and telecommunications devices. DSPs are the basis of many technologies, including mobile phones, personal computers, video recorders, CD players, hard disk drive controllers, and modems.

The application of digital signal processors in cellular phones is very significant. Signal compression, an important application of DSPs, is used in cellular phones to permit a larger number of calls to be handled simultaneously within each local “cell”. Signal compression also makes video telephony possible, letting people see one another while talking, with nothing more than a computer monitor, a small video camera, and a conventional telephone line linking them together. DSPs can also be used in applications that require high computational speed, such as computer video boards and specialized co-processor boards designed for intensive scientific computation.

Source by Anitha Hari


The Periodic Table is an arrangement of the chemical elements, sorted in order of increasing atomic number into vertical columns referred to as groups and horizontal rows known as periods. It is a resourceful tool in chemistry and other branches of science. Dmitri Ivanovich Mendeleev, a Russian chemist, formulated the Periodic Law and laid the foundation for the Periodic Table. The Periodic Law states that the properties of the elements recur periodically when the elements are arranged in order of increasing atomic mass. Henry Moseley, an English physicist, later organized the elements according to increasing atomic number, which is the ordering used in the modern Periodic Table. There are currently 118 elements on the Periodic Table, 90 of which are found in nature; the rest have been synthesized by humans.

Chemical elements are substances which cannot be chemically broken down into simpler substances. The elements of the periodic table are distinguished by their atomic numbers, the number of protons in the nuclei of the atoms of each element. In a neutral atom, the number of electrons is equal to the number of protons. There are three visible components to each element on the Periodic Table: the atomic number, the element’s symbol, and its atomic mass. The atomic number is the number of protons in an element’s atoms, the symbol consists of one or two letters which represent the element, and the atomic mass is the combined mass of the protons and neutrons. For example, the element carbon’s symbol is C, its atomic number is six, and its atomic mass is twelve.
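Those three visible components map naturally onto a small data record. Here is a toy Python sketch using the carbon values from the paragraph above (the class and field names are just illustrative):

```python
from dataclasses import dataclass

@dataclass
class Element:
    symbol: str          # one- or two-letter symbol
    atomic_number: int   # protons (= electrons in a neutral atom)
    atomic_mass: float   # combined mass of protons and neutrons

carbon = Element(symbol="C", atomic_number=6, atomic_mass=12.0)
print(carbon)  # Element(symbol='C', atomic_number=6, atomic_mass=12.0)
```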

The elements are broken up into several groups within the Periodic Table. A majority of the elements are metals, which are further divided into alkali metals, alkaline earth metals, transition metals, and inner transition metals. The last column at the very right of the table is known as the inert (noble) gases. The column before the inert gases holds the halogens. Like the inert gases and halogens, the four groups before these two columns also contain non-metals. The first two groups, beginning with Hydrogen and Beryllium, are 1A and 2A; the groups beginning with Boron, Carbon, Nitrogen, Oxygen, Fluorine, and Helium are 3A through 8A. The transition metals between groups 2A and 3A carry the B designations.

The Periodic Table of elements is essential in chemistry because its format encodes fundamental information that gives chemists the ability to infer an element’s properties from its position. With this data, chemists can predict chemical reactions based on the different groups (metals, inert gases, metalloids, etc.), and they can determine how different elements will associate with one another.

Source by Alliyah Beltran


Learning all you can about lead generation is what will set you apart from the competition. Knowledge is power, especially when it comes to the field of sales. Are you prepared to become the best of the best? If so, the article below is just waiting for you to read it.

Giving an incentive to purchase is a great way to succeed at generating quality leads, because a lot of people will get on board just for the incentive. If someone needs what you’re selling, incentive can cause them to buy from you instead of the competition. Provide an additional incentive to choose you, and your lead base will explode.

You will generate more quality leads by establishing yourself as a trustworthy provider. Do not use “screaming” ads or do anything cheesy that incorporates too much hype. Use facts and a rational speaking voice. Always be up front with others, and you will secure a loyal fan base as a result.

Survey your current customers about where they typically congregate online. To generate quality leads, you need to understand where your audience hangs out. Once you know, get involved in that community any way you can. That may mean advertising or it may mean becoming a thought leader in the community.

Check out local events in order to maximize your leads. If you’re allowed to have a table there, you could hand out pamphlets and hold a giveaway. Just ask people to leave their name and email in return for a ballot, but be sure to let them know if you’ll be adding them to a mailing list.

If you have not been tapping into the power of social media enough, then it’s time to expand your efforts. There are cost efficient social media campaigns you can run on the most popular sites, and ways to really make content go viral. All of your customer base is there to help you share what you can do for your new customers.

Talking with businesses in the area that are similar to yours can be very helpful. Landscapers might want to talk about growing a vegetable garden. Personal trainers could offer advice on how people can still be fit while they work full time. Let your skills teach others and earn from it.

One of the highest visitor to lead ratios you will find online is with LinkedIn. Therefore, this platform should be high on your list of tools for lead generation. Put together a polished and professional profile that includes links to all your landing pages and make LinkedIn a valuable part of your lead generation success.

You have competitors, but you also have companies within your industry that complement your business. Therefore, network with these companies so that you can exchange leads. This can be a great method of gaining new customers and strengthening your business niche in general for repeat business later on.

With all of these facts fresh in your mind, the time is now to start planning your new strategy. Get to work today so that you can ensure your success tomorrow. The sooner you get down to business, the faster your goals will be reached and your income will skyrocket.

Source by Kurt A Tasche


“The state of anomie is impossible wherever organs solidly linked to one another are in sufficient contact, and in sufficiently lengthy contact. Indeed, being adjacent to one another, they are easily alerted in every situation to the need for one another and consequently they experience a keen, continuous feeling of their mutual dependence.”

(Durkheim, The Division of Labor in Society, 304)

Emile Durkheim theorized the concept of anomie in his studies, The Division of Labor in Society and Suicide. Durkheim defined the term anomie as a condition where social and/or moral norms are confused, unclear, or simply not present. Durkheim felt that this lack of norms led to deviant behavior. Durkheim argued that sudden changes in society make formerly satisfactory norms obsolete. Under the strain of rapid change, social rules fail to keep pace with attitudes and expectations. Inappropriate rules result in contempt for all rules. Intense frustration and equally intense anxiety develop as men seek fulfillment. Dissatisfaction spreads through society and produces a general state of anomie: lack of clarity, ruthlessness, and personal disorientation.

Robert K. Merton extended Durkheim’s ideas by showing that individuals intensify their anomie when they abandon their norms to satisfy their unleashed desires. Merton theorizes that anomie (normative breakdown) and some forms of deviant behavior derive largely from a disjunction between “culturally prescribed aspirations” of a society and “socially structured avenues for realizing those aspirations.” For example, a once law-abiding businessman who resorts to arson to eliminate a more efficient competitor has begun to sever his connections with other members of society, thus increasing his anxiety and isolation.

Karl Marx, writing in the 1840s, described social alienation while developing the philosophy of communism. He believed that the evils of wage labor separated men from other men and eventually from themselves. Cash exchange causes this dehumanization, he argued, because it reduces men to the level of interchangeable objects. As Marx describes, “The possessing class and the proletarian class represent one and the same human self-alienation. But the former feels satisfied and affirmed in this self-alienation, experiences the alienation as a sign of its own power, and possesses in it the appearance of a human existence. The latter, however, feels destroyed in this alienation, seeing in it its own impotence and the reality of an inhuman existence.” Ultimately, this transformation leads to viewing man and nature as nothing more than things to manipulate. The result of this outlook is the psychological pain of total isolation from others and the natural self.

As we can see, Durkheim’s anomie is similar both to Merton’s interpretation of anomie and to Marx’s concept of alienation. Durkheim and Marx share the themes of isolation and disorientation. They differ, however, in emphasis: Marx’s alienation deals with money and its role in the proletarian’s life, and with how it keeps the ruling class up and everyone else down, while Durkheim’s anomie deals more with the attitudes and expectations of society, with people drifting away from healthy and normal ways of living rather than being forced out of them, as in alienation.

Source by Richard Perry


Games have been important to people since the ancient world, contributing to their entertainment. The Greeks invented the first games of antiquity, and it was they who conceived the Olympic Games as well. Those games were held in honor of the gods and of living persons, some as offerings of thanksgiving.

With the passing of time, people created many more games. Nowadays, for example, they play on the computer, killing monsters and trying to conquer land in order to build towns and capitals.

A very popular game continues to be “Hangman”. It was invented sometime during the Victorian era. It was mentioned for the first time in Alice Bertha Gomme’s “Traditional Games” in 1894, where it was named “Birds, Beasts and Fishes”. Later, in other sources, the game was called “Gallows”, “The Game of Hanging”, or “Hanger”.

Like any other game, it has rules, but they are very simple. In one early variant, a player writes down the first and last letters of a word for an object, animal, or plant, and the other player has to guess the letters in between. In the familiar version, the word to guess is represented by a row of dashes, giving the number of letters. If the guessing player suggests a letter which occurs in the word, the other player writes it in all its correct positions. If the suggested letter does not occur in the word, the other player draws one element of the hangman diagram.

The game is over when:

o The guessing player completes the word, or guesses the whole word correctly

o The other player completes the diagram.
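Those rules translate almost directly into code. Below is a minimal Python sketch of the guessing loop, assuming a fixed secret word and a plain text interface (both are illustrative choices, not part of the original description):

```python
def hangman(secret, max_wrong=6):
    """Play one round of Hangman at the terminal.

    The word is shown as a row of dashes; each wrong guess adds one
    element to the hangman diagram (here just counted, not drawn).
    """
    guessed = set()
    wrong = 0
    while wrong < max_wrong:
        # Reveal correctly guessed letters; dashes stand for the rest.
        shown = " ".join(c if c in guessed else "_" for c in secret)
        print(shown, f"(wrong guesses: {wrong}/{max_wrong})")
        if "_" not in shown:
            print("You win!")
            return
        letter = input("Guess a letter: ").strip().lower()
        if len(letter) == 1 and letter in secret:
            guessed.add(letter)   # write it in all its correct positions
        else:
            wrong += 1            # draw one more element of the diagram
    print(f"Diagram complete - you lose. The word was '{secret}'.")

hangman("pelican")
```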

We all know this game because we used to play it when we were kids, and it is a word game that improves thinking and enriches vocabulary. Its rules are very simple, and it is a challenge for each of us to test our knowledge of a given language.

Source by Petre An


With so many items in our lives, many people live by the common adage, “If it ain’t broke, don’t fix it.” In technology, this stance can be detrimental to our business and personal lives if we do not pay close attention to the risks that come with it. Using out-of-date technology can look like a money saver on the surface, but more often it is a money trap waiting to spring, from both a capital expense perspective and an operating expense perspective.

Years ago, I was working to identify a series of systems and determine the use of, and necessary upgrades for, those systems. In my work, I came across several systems of varying age that were still in use, and identified one, an AS400 that was over 15 years old. The system was core to the work of about 400 individuals and was critical to getting that work done, and each person I discussed the system with promptly told me two things: they could not work without the system, and it was OK because they paid for support on it.

Each person involved was adamant that we could not touch that system because “it was special”, “it could not be down”, and “they had support, so we did not need to worry about it.”

As my team reviewed the system, I sat down with the system owner and called the vendor. They had been paying an excessive amount of money each year for support, and I asked the vendor a simple question: “If the system goes down with a hardware failure, will you guarantee it will be repaired?” There was a pause, and then the answer came back: “Our SLA is that we will have a technician on site within 4 hours.” I smiled, waited, and asked the question differently: “Can you guarantee you will be able to bring the system back online?” The answer came back again: “Our SLA is that we will have a technician on site within 4 hours.” We had some additional discussion, but after the call I looked at the system owner, a non-technical person in charge of a major area, and asked if they understood what had just happened. They were very thoughtful and simply said, “I think we need to look at some additional options.”

We replaced that system with a newer box and worked toward replacing the software. By using virtualization techniques, we moved the system to a more resilient platform, ensuring it would be online as necessary, and ensuring the solution would not be a tech on site within 4 hours but a system, supporting 400 workers, that would stay online even in the event of a disaster.

So why was this a good decision? It is easy. First, if the system had gone down for even one hour, the 400 workers affected would have cost an excessive dollar amount. Even at a minimum-wage job of $10 an hour, which this was not, that is $4,000 an hour. An extended outage could have produced a massive operating expense in lost time, overshadowing any other cost. Second, if the data had been lost, there would have been no alternate operating systems or hardware to bring the system back online, and the cost of losing the data could have been immeasurable. Third, the system, having been out of date for so long, had numerous security issues and could easily have led to a breach of data protected by regulation; that alone can destroy both a business’s credibility and its finances, with minimal opportunity for recovery. Fourth, the system was impacting users and becoming less and less usable, forcing workers to find workarounds that were even more costly.

Of course there were many more reasons, but why does this matter to small and large businesses alike? Well, as the age of a system goes up, we add risk to that system and potential points of failure, including parts-replacement issues. The bigger the system, meaning the more moving parts it has, the more likely it is to run into issues, as such systems can be affected more easily and impact users more easily.

A simple approach can be expressed as: HardwareAge + OSAge + Risk + UserImpact + FinancialImpact - DRResilience < 10.

Why?

Well, as hardware ages, it requires updates but may also require replacement parts. As parts become less available, supporting the system gets difficult and can be frustrating. If you virtualize, you should consider the virtual platform part of the same equation, but in that case the hardware age of the system is always 1, as the virtual platform itself then becomes what needs upgrading.

The operating system can become a nightmare as its age goes up, since it develops more and more security risks. If it is end-of-life and no longer supported, you are instantly at major risk and need to find a solution. We often forget the operating system, yet it underlies much of what we do and, for most programs, is the foundation for doing any work at all.

Risk could be a massive discussion on its own, but here let us consider it as regulatory or agency risk, since the whole equation is indirectly about risk. Rate risk from 0 to 5, where 5 covers the most tightly controlled items and regulatory work, such as HIPAA, and 0 is no risk at all.

User impact and financial impact are subjective, but rate each from 0 to 3, where 0 is no impact at all and 3 is high impact.

Disaster resilience subtracts from your score, because it creates situations where you can be back online quickly, with less risk of downtime. It is achieved through solutions that bring your system back online fast; using a virtual machine with a product like Datto can get you back online quickly even after a total loss, lowering the overall risk.
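As a rough sketch, the heuristic can be written as a small Python function; the weightings and cut-off are exactly the ones described above, and the example values are the AS400 case from earlier:

```python
def system_risk_score(hardware_age, os_age, risk, user_impact,
                      financial_impact, dr_resilience):
    """Heuristic replacement score for an aging system.

    risk: 0-5 (5 = heavily regulated, e.g. HIPAA data)
    user_impact, financial_impact: 0-3 (3 = high impact)
    dr_resilience subtracts from the score.
    A result above 10 means it is time to start talking to someone.
    """
    return (hardware_age + os_age + risk
            + user_impact + financial_impact - dr_resilience)

# The AS400 example: 15-year-old hardware, 16-year-old OS, regulated
# data, high user and financial impact, minimal DR resilience.
score = system_risk_score(15, 16, 5, 3, 3, 1)
print(score, "-> replace" if score > 10 else "-> probably fine")  # 41 -> replace
```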

This is not a hard rule; it is something I worked out to explain the risks associated with systems in a simple manner. A good technology professional would look at this and say it is a start, that there is a lot more to it, but it will tell you where to begin. If you come up with a number greater than 10, it is definitely time to start talking to someone. If we take the example we had previously, we get these numbers:

15 + 16 + 5 + 3 + 3 - 1 = 41

Every increment beyond 10 should have been a red flag, and in this case even a far better DR resilience score would still have left the result bad.

This is still just a rough guide. It is just as valid to measure supportability and availability with no equation at all. As the number of people who can support a system dwindles, risk builds quickly, whether the system is high risk or not. Multiple times I have been put in the position of finding a way into a system that no one knows a password for and no one knows how to repair. If your support is single-threaded, it is time to replace the software, the hardware, or both.

It is also important to look closely at what vendors tell you. Obviously, there is no guarantee on any system, but when you are not given an ETA or an escalation path in case of an outage, you are flirting with downtime and the potential costs associated with it.

Remember, if a system is not critical, will cost no time, will not be missed, has no critical or useful data on it, and could be gone forever with no impact on you or your business, then maybe it is OK to keep a very old system. There are some exceptions as well, where a piece of software would cost a lot to upgrade and the upgrade is avoided, but in the end, if you have these systems and say, “If it ain’t broke, don’t fix it,” maybe those machines should be shut down anyway and new solutions found that really help the business.

Source by Andrew Allen Smith


In the beginning (1966), there was ELIZA: she was the first bot of her kind, had roughly 200 lines of code, and was extremely smart. But you probably don’t know her. Later came PARRY, who was smarter than ELIZA (and could imitate a paranoid schizophrenic patient). But you probably don’t know PARRY either. Or ALICE (1995), or JABBERWACKY (2005). But you do know Siri! And that right there is brilliant marketing.

Bots have existed for a long time now, but they were never really popular until Apple. Always one step ahead of its competition, Apple not only introduced the services of a chatbot but also used it to create a unique brand image. It killed two birds with one metaphorical stone known as Siri. There was no going back from there. Siri was, and is, a household name. She can read stories, predict the weather, and give extremely witty answers just as a human would, and in one instance Siri is even known to have dialed 911 and saved a life.

Why marketing with the bots is a good idea

Although still in their early days, chatbots are changing the way brands communicate and, thereby, market themselves. For starters, individuals are bogged down by a million apps that clutter their digital space. Where apps and websites have failed, bots are succeeding. They perform relevant functions such as addressing queries, providing customer support, and offering suggestions, and they do so over the secure messaging platforms that customers already frequent. Facebook’s Messenger, with over 800 million users, is one such example. If the words of Microsoft’s CEO, Satya Nadella, are anything to go by, chatbots are the next big thing.

Chatbots are also replacing traditional marketing methods with personal conversations laced with subtle upsells. Take Tacobot, for instance, Taco Bell’s latest bot. The next time someone wants to order tacos, Tacobot will list out the menu and let the user know if a one-plus-one offer is going on. It will also suggest add-ons like fried beans and salsa. If the user agrees and places an order, the bot has just made an improved sale without resorting to pushy sales tactics. That’s the bot as customer service, and a very effective one at that. Another benefit: chatbots are smart cookies. They scan internet cookies and track predictive analytics to provide suggestions based on past searches and purchases, and much of the time this is pretty effective.

Today, all major brands have developed chatbots. Amazon has Echo, which allows users to order a pizza or buy a pen, while Microsoft’s Cortana is always ready to answer queries. Bots have the brilliant quality of being human-like and logical at the same time, minus the human complications. That sounds like the perfect relationship every brand should have with its customers, and the bot can help you get there. After all, it’s a marketing pro.

Source by Ankeet Sarngin Kanungo


First, let’s take a look at how Young’s double slit experiment is carried out.

We have a monochromatic light source on the left. Through the first slit and diffraction, the light waves propagate to the second screen, where the double slit is. Further diffraction at the double slit produces two coherent light waves, which then interfere with each other: constructive interference produces bright fringes on the screen (marked n = 0, 1, 2, …), while destructive interference produces dark fringes.

The necessary condition for constructive interference is that the two coherent light waves coincide in phase. When two light waves are in phase, their path difference = nλ, where λ is the wavelength and n is an integer (0, 1, 2, …).

From the figure above, the difference in the path lengths traveled by the two light waves can be determined from the red triangle drawn, and it is equal to a sin θ.

So mathematically, a sin θn = nλ.

Also from the triangle, when θ is sufficiently small and x is much smaller than D, we can approximate sin θ ≈ x/D.

Bringing the equations together, we have

sin θn = x/D = nλ/a

Rearranging the equation, we have:

Distance between the n-th bright fringe and the 0th (central) bright fringe: x = nλD/a, OR

Spacing between successive bright fringes: x = λD/a

where λ = wavelength of light source

D = distance between double slit and screen

a = separation between the two slits

Similarly, for the dark fringes, destructive interference occurs when the two coherent light waves are in anti-phase, i.e. their path difference = (n + 1/2)λ. Using the same technique, we get:

Distance between the n-th dark fringe and the center = (n + 1/2)λD/a
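As a quick numerical check, here is a short Python sketch plugging values into the fringe-spacing formula; the wavelength, slit separation, and screen distance below are assumed typical lab values, not from the original text:

```python
wavelength = 650e-9   # red laser, 650 nm (illustrative)
D = 1.0               # slit-to-screen distance, 1 m (illustrative)
a = 0.5e-3            # slit separation, 0.5 mm (illustrative)

# Spacing between successive bright fringes: x = lambda * D / a
delta_x = wavelength * D / a
print(f"Fringe spacing: {delta_x * 1e3:.2f} mm")  # ~1.30 mm

# Position of the n-th bright fringe from the centre: x_n = n * lambda * D / a
for n in range(4):
    print(n, f"{n * wavelength * D / a * 1e3:.2f} mm")
```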

Source by Cress Chronox


Call centers are always looking for new strategies and areas in which to improve their performance and deliver a higher level of customer satisfaction to their clients. Listed below are a few tips that will help you optimize your offshore call center services so that you can achieve better results.

Analyze Things, Record Them, and Measure Them: You need to analyze everything happening at your contact center. Your agents receive and make numerous calls on a daily basis, and you need to keep a record of all of them across various criteria. Information such as the time of the call, its duration, and its nature should be recorded. This recorded, well-compiled data should then be used to measure the overall performance of the contact center across its various segments. Weak spots should be identified and worked on to improve overall performance.

Categorize Calls: Your agents will handle many types of calls on a daily basis. Categorizing calls makes it easier to store them in a meaningful manner so that you can refer back to them in the future. Going back to the same records after a few months and searching for specific information would be very tough without proper categorization. There will always be an uncategorized category, which can make searching a little tough; it is suggested that you leave proper comments and notes along with each entry in this category so that finding specific information in it later becomes easier.

Route Calls to the Most Suitable Agents: There are many good call routing solutions available on the market. It is suggested that you use a proper call routing system that can transfer calls to agents according to the type of call. Different agents have different specializations, so each can handle a particular set of calls more efficiently than others. Various other factors should likewise be considered before forwarding a call to any agent. Keep your requirements in mind while implementing a call routing solution at your center; a toy sketch of the idea follows below.
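Here is that sketch, a minimal Python illustration of skill-based routing; the call types and agent group names are made up for illustration, not taken from any real product:

```python
# Map each call type to the agent group best specialized to handle it.
ROUTES = {
    "billing": "billing_team",
    "technical": "tech_support_team",
    "sales": "sales_team",
}

def route_call(call_type, default_queue="general_queue"):
    """Return the agent queue for a call, falling back to a general queue."""
    return ROUTES.get(call_type, default_queue)

print(route_call("technical"))   # tech_support_team
print(route_call("complaint"))   # general_queue (no specialist group defined)
```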

The Training: Without well-trained agents, you cannot deliver the kind of service that keeps customers satisfied. Agents should have proper knowledge of the process and relevant experience in the industry. Most importantly, they should be trained in the most up-to-date strategies.

A highly efficient team of agents is a must for delivering any call center service, whether it involves phone answering services, offshore technical support services, or any other service.

Source by Gloria Randell


When you think of a business card, you probably imagine a professional-looking glossy card with the standard business details: company logo, person’s name, job title, and phone number.

Very bland, very boring, and the only people who will look at it twice are those who just needed your phone number written down because they’d already been sold on using your services.

I would argue that the whole idea of a “Business Card” is to help get business… yet something has gone wrong along the way. They’ve turned into “contact cards” that certainly don’t work at getting any business.

You’ll see these exact same cards being handed out by the dozen at trade shows, seminars and any type of business congregation… with the hope that the receiver will be interested and will contact the giver for more information.

But why would they?

Why would anyone who has no idea who you are contact you because your card says that your name is “John Smith” and you’re a “Business Consultant” or whatever?…

Answer: They wouldn’t!

Instead of this ‘image rich, content poor’ business card, create an ‘image poor, content rich’ one.

You basically want to use all the real estate of the business card to create a “Direct Response” style card.

Here’s what it’ll need:

  1. Plain white background with black text
  2. Big, attention grabbing headline
  3. Explain what you can do for the client
  4. Make an irresistible offer
  5. Create a “call to action”
  6. Use scarcity to create immediate action

Using these basic direct response strategies ensures that you’ll get a much higher response than you would with the standard business card layout.

Plus, hardly anybody does this, so your cards will stand out from the crowd. While everyone else is trying to add more colour, fancy fonts and pointless images, your plain looking direct response business card will get the attention.

Don’t use images unless they add to the message… and cram as much text in as you need. Don’t be afraid to make it small, because if people care about what you’re saying, they’ll squint to read it if they have to!

TIP: Instead of leaving the back of your business cards blank, spend a little extra and use the back for testimonials. Add at least one, along with photos of each testimonial giver.

Business cards are so cheap to create these days, and if you can have them working as their own sales force, why wouldn’t you design yours immediately and start noticing the difference this week?

Source by Dane Bergen
