Thursday, November 15, 2018

Does the Selection of Data Structures and Algorithms Really Matter?

YES.  Sorry, I didn’t mean to scare you there, but yes, data structure and algorithm selection matters in the design and development of efficient programs.  Shaffer (2013) addresses the skepticism surrounding this question, given that computing power has been increasing at a rapid rate.  Wouldn’t the increase in computing power negate the need to optimize software through the use of appropriate data structures and algorithms?  No.  Increased computing power combined with optimized code allows us to solve problems of even greater complexity than we might previously have imagined.

SO, what are data structures and algorithms?  A data structure is a way of organizing data, and includes its associated operations (Shaffer, 2013).  Examples of data structures include arrays, lists, stacks, queues, trees, and hash tables, among others.  An algorithm is a series of steps that, taken together, carries out a specific task, such as sorting a list (Lysecky et al., 2015).
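To make two of these structures concrete, here is a minimal Java sketch (my own example, not from the cited texts): a stack is last-in, first-out, while a queue is first-in, first-out, and Java’s ArrayDeque can play either role.

```java
import java.util.ArrayDeque;

public class StructuresDemo {
    public static void main(String[] args) {
        // A stack is last-in, first-out (LIFO).
        ArrayDeque<String> stack = new ArrayDeque<>();
        stack.push("first");
        stack.push("second");
        System.out.println(stack.pop()); // prints "second"

        // A queue is first-in, first-out (FIFO).
        ArrayDeque<String> queue = new ArrayDeque<>();
        queue.offer("first");
        queue.offer("second");
        System.out.println(queue.poll()); // prints "first"
    }
}
```

Notice that the data structure is not just the stored items but also the operations it supports — push/pop versus offer/poll.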

BEFORE selecting a data structure and/or algorithm, it is important for programmers to analyze the goals and requirements of the project.  According to Shaffer (2013), there are three main steps to achieve this:
     1) Analyze your problem to determine the basic operations that must be supported.  Examples of basic operations include inserting a data item into the data structure, deleting a data item from the data structure, and finding a specified data item.
     2) Quantify the resource constraints for each operation.
     3) Select the data structure that best meets these requirements.
Chances are that no single data structure or algorithm will be best in all circumstances.  Otherwise, the data structures and algorithms that remain in use today would have already become obsolete.  As Shaffer (2013) states, “Each data structure and each algorithm has costs and benefits.”

WHEN considering appropriate data structures and algorithms for a project, programmers require a way to predict run-time, as well as to compare one data structure or algorithm to another.  This is often achieved by evaluating time and space complexities.  Time complexity is the amount of time required for an algorithm to run, while space complexity is the amount of memory required for execution.  Both of these metrics are commonly expressed using Big O notation, which describes how running time or memory grows as the number of elements increases (University of Texas, n.d.).
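To make the notation concrete, here is a small Java sketch (my own example, not from the cited sources) contrasting an O(n) linear search, which may examine every element, with an O(log n) binary search, which halves the search space at each step but requires sorted input:

```java
public class SearchComplexity {
    // Linear search: O(n) time — in the worst case, checks every element.
    public static int linearSearch(int[] a, int target) {
        for (int i = 0; i < a.length; i++) {
            if (a[i] == target) return i;
        }
        return -1;
    }

    // Binary search: O(log n) time — halves the search space each step,
    // but requires the array to already be sorted.
    public static int binarySearch(int[] a, int target) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = (lo + hi) >>> 1;
            if (a[mid] == target) return mid;
            if (a[mid] < target) lo = mid + 1;
            else hi = mid - 1;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] sorted = {2, 4, 8, 16, 32};
        System.out.println(linearSearch(sorted, 16)); // prints 3
        System.out.println(binarySearch(sorted, 16)); // prints 3
    }
}
```

On five elements the difference is invisible; on a billion elements, binary search needs about 30 comparisons while linear search may need a billion.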

ONE example of how input size affects time complexity is described in Complexity Analysis (University of Texas, n.d.).  In the scenario presented, the time complexity of sorting a data set with a merge sort algorithm is compared to sorting the same data set with a selection sort algorithm.  On a small data set, the selection sort algorithm is more efficient.  However, as n (the number of elements) becomes very large, merge sort is a much better alternative.  In fact, with a data set containing 10,000,000 elements, the selection sort requires 100 trillion accesses, or about 11.5 days to complete.  On the other hand, the same data set run with a merge sort algorithm requires 1.2 billion accesses, for a total sorting time of 12 seconds!  Clearly, algorithm selection in this scenario has large implications.
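For reference, here is the slower of the two algorithms as a Java sketch (my own code, not taken from the cited lecture); the nested loops are what make selection sort O(n²) and so impractical at that scale:

```java
import java.util.Arrays;

public class SortComparison {
    // Selection sort: the outer loop runs n times and the inner loop
    // scans the remaining elements, giving O(n^2) comparisons overall —
    // fine for small n, hopeless for 10,000,000 elements.
    public static void selectionSort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int min = i;
            for (int j = i + 1; j < a.length; j++) {
                if (a[j] < a[min]) min = j;
            }
            int tmp = a[i];
            a[i] = a[min];
            a[min] = tmp;
        }
    }

    public static void main(String[] args) {
        int[] data = {5, 2, 9, 1, 7};
        selectionSort(data);
        System.out.println(Arrays.toString(data)); // prints [1, 2, 5, 7, 9]
    }
}
```

An O(n log n) algorithm such as merge sort does roughly n · log₂(n) work instead of n², which is the entire difference between 11.5 days and 12 seconds in the scenario above.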

SIMILARLY, the benefit of selecting the correct data structure is illustrated in an example provided by Shaffer (2013).  Shaffer describes a scenario in which a programmer needs to select a data structure that organizes account and transactional information for a bank.  Opening a new account does not need to be especially fast (clients are willing to invest time to open an account), but withdrawing or depositing money at an ATM does, and these constraints shape the choice of data structure.  Closing an account likewise does not demand speed, since most clients are not concerned if it takes 24 hours, as long as they can withdraw the money first.  In this scenario, Shaffer recommends a hash table, since the ability to search is high priority (to make deposits and withdrawals), insertion (opening a new account) is medium priority, and deletion (closing an account) is low priority.
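As a sketch of that recommendation (the account numbers and balances here are invented for illustration), Java’s built-in HashMap gives average constant-time search, insert, and delete — exactly the mix of priorities Shaffer describes:

```java
import java.util.HashMap;

public class BankDemo {
    public static void main(String[] args) {
        // Hash table keyed by account number: average O(1) search,
        // insert, and delete.
        HashMap<String, Double> accounts = new HashMap<>();

        accounts.put("ACCT-1001", 250.00);          // open an account (insert)
        Double balance = accounts.get("ACCT-1001"); // ATM lookup (search)
        System.out.println(balance);                // prints 250.0
        accounts.remove("ACCT-1001");               // close an account (delete)
    }
}
```

The trade-off is that a hash table keeps accounts in no particular order — which is fine here, since the bank rarely needs to list accounts sorted.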

THEREFORE, data structure and algorithm selection does matter, because the strategic design of applications can reduce time and space complexity, and may even improve the user experience.  As Shaffer (2013) states, “using the proper data structure can make the difference between a program running in a few seconds and one requiring many days.”

Resources

Lysecky, R., Vahid, F., Lysecky, S., & Givargis, T. (2015). Data structures essentials. Retrieved from https://zybooks.zyante.com/#/zybook/DataStructuresEssentialsR25/chapter/1/section/3

Shaffer, C. A. (2013). Data structures and algorithm analysis (Edition 3.2). Retrieved from http://people.cs.vt.edu/~shaffer/Book/JAVA3elatest.pdf

University of Texas at Austin Computer Science Department.  (n.d.).  Complexity analysis.  Retrieved from http://www.cs.utexas.edu/users/djimenez/utsa/cs1723/lecture2.html

Zeigler, J. (2004). Time complexity, space complexity, and the O-notation. Retrieved from http://www.leda-tutorial.org/en/official/ch02s02s03.html

Thursday, October 18, 2018

So You Want to Learn Java?

Good for you!  Java is a great place to start learning programming, as it is one of the most popular object-oriented programming languages.  Other object-oriented languages include C#, PHP, Python, and C++.  According to Java T Point (n.d.), object-oriented languages allow a developer to design programs through the use of classes and objects.  Objects are real-world entities that exhibit both states and behaviors (Oracle, 2015).  A dog, for example, can be modeled as an object.  A dog has states such as breed, size, and age.  A dog also exhibits behaviors, such as barking, sleeping, and running.  New objects are built from the blueprint of a class.  A class describes objects with similar states and behaviors.  For example, a Shih Tzu, a Pug, and a German Shepherd might all belong to the class “dogs”.  The use of objects and classes “simplifies software development and maintenance” (Java T Point, n.d.).

There are a number of reasons why object-oriented programming languages are so widely used.  Most of them can be traced to the four principles of object-oriented languages: inheritance, polymorphism, abstraction, and encapsulation.  Inheritance is the idea that similar objects can inherit states and behaviors (Oracle, 2015).  In the example above, since a Shih Tzu, a Pug, and a German Shepherd all share similar states and behaviors, each of these objects might fall under a superclass called dogs.  Inheritance allows for code reusability, which simplifies software development.  Polymorphism refers to “multiple methods with the same name, but slightly different functionality” (Raymondlewallen, 2005).  Again, this allows for code reusability, since a method (a call to action on an object) can be a modified version of a pre-existing method, rather than requiring a whole new set of code.  Abstraction describes the act of hiding details related to processing (Java T Point, n.d.).  This simplifies the interface and makes code more readable.  Lastly, encapsulation refers to the fact that code is organized into units, such as objects and classes (Java T Point, n.d.).
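These principles can be illustrated with a minimal Java sketch of the dog example (the class and method names are my own, not from the cited sources):

```java
// Encapsulation: the breed state is private, exposed only through a getter.
class Dog {
    private final String breed;

    Dog(String breed) { this.breed = breed; }

    String getBreed() { return breed; }

    // A behavior shared by all dogs.
    String bark() { return "Woof!"; }
}

// Inheritance: Pug reuses Dog's states and behaviors...
class Pug extends Dog {
    Pug() { super("Pug"); }

    // ...and polymorphism: it overrides bark() with its own version.
    @Override
    String bark() { return "Yip!"; }
}

public class OopDemo {
    public static void main(String[] args) {
        Dog generic = new Dog("Shih Tzu");
        Dog pug = new Pug(); // abstraction: callers only see the Dog interface
        System.out.println(generic.bark()); // prints Woof!
        System.out.println(pug.bark());     // prints Yip! — chosen at runtime
    }
}
```

Note how the caller holds both objects through the same Dog type, yet each barks in its own way — that is polymorphism doing the work.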

So how do you get started with Java?  One of the best resources I’ve found is The Java Tutorials.  This resource covers not only the basics of Java, but also how to install the necessary software, with a step-by-step approach to writing your first program, “Hello World.”  While this program can be written in a number of different IDEs (integrated development environments), Oracle recommends the NetBeans IDE.  For steps to install Java and the NetBeans IDE and write the “Hello World” program, click here.  It is important to note that NetBeans does not support JDK version 9 or above, so it is recommended to install JDK 8.  If you do decide to follow these steps, please leave me a comment describing your experience.  Have fun, my fellow coders!
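To give you a taste of what you will be writing, the classic “Hello World” program looks essentially like this in Java:

```java
public class HelloWorldApp {
    public static void main(String[] args) {
        // Display the greeting on standard output.
        System.out.println("Hello World!");
    }
}
```

Every Java program starts from a main method like this one; the IDE compiles the class and runs it, printing the text to the output window.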

Resources:

Java T Point. (n.d.). Java OOPs concepts. Retrieved from http://www.javatpoint.com/java-oops-concepts

Oracle. (2015). "Hello World!" for the NetBeans IDE. Retrieved from https://docs.oracle.com/javase/tutorial/getStarted/cupojava/netbeans.html

Oracle. (2015). The Java tutorials. Retrieved from http://docs.oracle.com/javase/tutorial/index.html

Oracle. (2015). Lesson: Object-oriented programming concepts. Retrieved from http://docs.oracle.com/javase/tutorial/java/concepts/index.html 

Raymondlewallen. (2005, July 19). 4 major principles of object-oriented programming.  Retrieved from http://codebetter.com/raymondlewallen/2005/07/19/4-major-principles-of-object-oriented-programming/

Monday, July 2, 2018

Programming Languages: What are they, and which ones should I know?

Since my goal is to go into software development as a career, I believe having a thorough understanding of programming languages will be essential to my success.  As for my experience, I have been exposed to both frontend and backend programming languages.  Through a weekend-long workshop at the Turing School of Software and Technology, I learned the basics of Ruby as a backend language.  We then touched on HTML, CSS, and JavaScript as frontend languages.  My most extensive programming experience comes from the Programming Concepts course I took at Ashford University, which taught Java.  There are two questions that I focused on during my research into programming languages: 1) What are the best applications of different programming languages?  2) Which programming languages will I need to know to make myself marketable in the rapidly changing tech industry?

Programming languages are important because they allow users to execute complex tasks on a device without needing to know machine language.  Programming languages are used to send commands to the computer and to simplify the operation of devices (i.e., operating systems).  The instructions for a device are in the code, or software, which is written in a specific programming language.  The programming languages that we reviewed in our textbook are all very different from one another.  Machine language is the lowest-level language, and the only language that computers can read and understand.  Machine language is simple binary code made up of ones and zeros.  Assembly language is a step up from machine language in that it reads a bit more like English; however, writing complex functionality in assembly language is tedious and error-prone.  High-level languages like Python are the most productive and commonly used today.  They have similarities to English, and they allow for complex functions and logic, including conditional statements and loops (Vahid, 2017).
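As a small illustration of that last point (my own example, not from the textbook), a loop and a conditional take only a few readable lines in a high-level language like Java, where the equivalent machine code would be dozens of opaque binary instructions:

```java
public class HighLevelDemo {
    // Sum the even numbers from 1 to n — a loop and a conditional
    // expressed in one short, readable method.
    public static int sumEvens(int n) {
        int sum = 0;
        for (int i = 1; i <= n; i++) {
            if (i % 2 == 0) sum += i;
        }
        return sum;
    }

    public static void main(String[] args) {
        System.out.println(sumEvens(10)); // 2+4+6+8+10, prints 30
    }
}
```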

Understanding the evolution of programming languages is pertinent to understanding how computers operate so efficiently today.  Just as computer hardware has evolved over time, so has computer software, thanks to contributions from scientists, inventors, and mathematicians including Joseph Marie Jacquard, Charles Babbage, Ada Lovelace, Herman Hollerith, Alan Turing, and Grace Hopper (Vahid, 2017).  Computers used to be bulky and simplistic.  In fact, “Computers originated from telephone switches in the early 1900's” (Vahid, 2017).  These switches were used in computers to perform calculations.  A switch could have one of two positions—one or zero.  “A single 0 or 1 is called a bit. 1011 is four bits. Eight bits, like 11000101, are called a byte” (Vahid, 2017).  The specific sequence of bits determines how a computer reads and stores information.  The original switches were very large, and computers had to be manually programmed by adjusting the switches to the configuration required for a computation.  The first computer programs used machine language and punch cards; higher-level languages did not evolve until the 1950s, with the development of languages such as FORTRAN and COBOL: “John Backus introduced FORTRAN, usually considered the first high-level programming language” (Foster, 2017).  These high-level languages were only possible after Grace Hopper wrote the first compiler, which allowed high-level languages to be translated into machine language (Vahid, 2017).

Computers today are much smaller and execute functions much more efficiently than they did in the past.  In fact, “beyond business and personal computing devices like PCs, tablets, and smartphones, computers exist invisibly in nearly anything electrical today too” (Vahid, 2017).  One of the most important pieces of computer hardware today that allows computers to be smaller, while maintaining the speed of operations, is the computer chip.  Rather than using big, bulky switches, computer chips are circuits composed of transistors and wires, which have become much smaller over time.  According to Vahid (2017), “A 1960 IC (integrated circuit) could hold just a few transistors, but a modern IC can hold billions.”  Since transistors and wires are much smaller today than they were in the past, it is possible to switch between one and zero much more quickly, and the signal does not need to travel as long a distance (Vahid, 2017).  The combination of smaller computer chips and the development of high-level programming languages has allowed modern computers to operate much more efficiently.

There are many modern applications of high-level programming languages.  These languages are used to build websites, computer apps, mobile apps, web apps, browsers, operating systems, anti-virus software, and software within embedded computers (Vahid, 2017).  Some of the most common computer applications that have been built include the word processor, spreadsheets, presentation apps, and databases.  These applications allow users to create word documents, organize and analyze data, develop engaging presentations, and retrieve information efficiently from a database.  Without high-level programming languages, these applications would have taken much longer to evolve to what they are today.  In addition, without programming languages, we would not be able to establish network connections to browse the web and retrieve information through the internet.  Programming languages have allowed computers to communicate across a network, sending packets of information back and forth between different IP addresses.  Furthermore, to protect users’ privacy and security, there are now a variety of software packages available to defend personal and business devices against malware (Vahid, 2017).  Again, the use of programming languages makes the development of these protective products a much faster process.

Programming languages today are fundamental to the process of software engineering.  According to Guzdial (2018), “a language can usefully constrain and facilitate programmers’ work to improve problem-solving and productivity.”  Will all types of programming languages be used in the future?  Machine language is the most basic language of computers, so it will always serve as the foundation that the other languages are built on top of.  Assembly language can be used for basic functions and basic manipulation of data.  High-level languages, such as Python, are used for complex manipulations and large data sets (Vahid, 2017).  High-level languages are the most popular today because programmers can accomplish tasks much more quickly with them than with machine language or assembly language.  However, it is important to recognize that some of the programs we have today are relics from the past, which need to be maintained and understood: “Programming is the infrastructure for our world. There are large systems still in use today written in Cobol and PL/1, and we have to maintain that information infrastructure” (Guzdial, 2018).  Therefore, it is important that knowledge of older programming languages is perpetuated in order to maintain the infrastructure that was first established.

Deciding which programming language to use for a new project can be challenging for today’s software developers.  According to Foster (2017), “Myriad languages have been developed in the last six decades, with at least a few dozen in common usage today.”  The question of which languages software developers should know does not appear to be going away any time soon.  While some may think that one high-level language is sufficient, a study comparing the code quality of GitHub projects written in different programming languages found the following: “While they suggest that the language does indeed matter, almost all of the observed effects are small … except that some particular language features, such as a lack of memory safety, do have profound effects” (Foster, 2017).  In addition, beyond narrowing down which language might be best for a given project, it is important to consider that, “in practice the choice of programming language is often constrained both by external factors (for example, the language of existing codebases) and the problem domain (for example, device drivers are likely to be written in C or C++)” (Foster, 2017).  Again, this point emphasizes the need to maintain knowledge of older languages, upon which existing software was built.  Based on these two views, it is important that software developers know a variety of languages and can adapt quickly to new ones, depending on the needs of a project.

Which programming languages are taught in introductory computer science courses varies from university to university and from year to year.  Even universities cannot seem to agree that one language is better than another to learn first.  According to Siegfried et al. (2016), “The choice of a programming language for an introductory programming course has been a topic of debate for over forty years, and the academic community has seen a variety of programming languages gain and then subsequently lose popularity.”  Because the data changes so often, Richard Reid, a computer science teacher at Michigan State University, has been tracking which language is taught first at universities since the 1990s.  Every few years he releases a new list of the distribution of introductory programming languages chosen by universities for a given year.  Between 2011 and 2015, 83 schools in the study changed the programming language taught in their introductory computer science course, with the largest shift being from Java to Python.  In 2015, the most popular language taught in introductory courses was Java, with Python second and C++ third.  These three languages remained the top three chosen (although with differing orders of popularity) regardless of the region of the US in which the school was located (Siegfried et al., 2016).  While the debate continues regarding which programming languages software developers should know, evidence suggests that to have an adequate foundation in programming languages and be competitive in today’s job market, it’s important to know at least Java, Python, and/or C++.

Resources
Foster, J. S. (2017). Shedding new light on an old language debate. Communications of the ACM, 60(10), 90. doi:10.1145/3126907

Guzdial, M., & Landau, S. (2018). Programming programming languages, and analyzing Facebook's failure. Communications of the ACM, 61(6), 8-9. doi:10.1145/3204443

Siegfried, R. M., Siegfried, J. P., & Alexandro, G. (2016). A longitudinal analysis of the Reid list of first programming languages. Information Systems Education Journal, 14(6), 47-54.

Vahid, F., & Lysecky, S. (2017). Computing technology for all. Retrieved from zybooks.zyante.com/

Monday, June 25, 2018

Is My Information Secure?

Information and system security are important in both individual and organizational settings. A breach in security can cause emotional, political, and/or monetary damage, since a breach can allow sensitive information to get into the hands of others. Once information is breached, the owner of the information has little control over where the information ends up or how it is used (Vahid, 2017). For example, most offices in the healthcare industry now use electronic medical records. Healthcare facilities are very diligent about the security of their systems, since patient information is confidential, and a breach of security would violate HIPAA laws. In a different scenario, an individual would not want to accidentally download malware onto their system, which could scrape their computer for personal information.

One type of attack against a system is called a Denial of Service (DoS). A DoS can occur when a destination server becomes overloaded with access requests. One way that hackers achieve this overload is through a botnet. A botnet is a group of computers that have been infected with malware. Once the botnet is established, the hacker can command the devices to simultaneously send large quantities of access requests to a destination on the web (Vahid, 2017). Normally, a ping is a tool used to diagnose network and connection problems by sending test packets of data to determine the round-trip time and whether or not the packets successfully arrive at their destination. However, in this scenario, hackers may use the ping command maliciously to send access requests. When too many access requests are received at once, the server becomes overloaded and cannot respond to valid access requests from real users. The user therefore experiences a “denial of service”: their access request cannot be granted, and they cannot reach the destination. In some cases, the overload can even cause a server to crash.

One of the biggest issues related to information and system security right now is security holes, or vulnerabilities. Hackers target these vulnerabilities to gain access to a system and steal information or disrupt the functioning of an organization. According to Vahid (2017), “Security holes commonly exist in operating systems. Once discovered, OS makers update the OS to close such holes. Thus, computer users are advised to keep their OS'es up-to-date, not only to gain new features, but to close security holes.” Therefore, individuals can help protect themselves by keeping the software on their personal devices updated. However, users cannot necessarily control the security of their information when hackers target vulnerabilities in large companies. For example, in 2016 there was a DDoS attack on Netflix: “intentionally overwhelming a service such as Netflix with potentially millions of simultaneous hits can crash a provider’s servers, and this was one of the biggest attacks in recent history” (Weber, 2017). These security breaches are damaging because hackers can obtain the personal information of users. According to Weber (2017), “Large data breaches such as what happened at Yahoo can have large financial rewards for the hackers, who then just sell the data in the underground black market. Even if your information is worth only a few pennies, the theft of a million records can pay back the hackers quite nicely for their efforts.” This shows that while large corporations employ methods to keep their systems secure, they are still vulnerable to attack. Therefore, it is important to provide only as much personal information as needed when creating new accounts.

Another major issue involving information security is phishing. This strategy of illegally obtaining data uses spam e-mail to trick a user into providing sensitive information, such as account numbers, social security numbers, and/or passwords. Oftentimes the e-mail appears to come from a legitimate company, such as a bank. Typically, the e-mail includes a link that appears to take the user to their account log-in page and asks the user to enter their information. However, even though the link appears legitimate, the site that it opens is often an imitation site used to steal the victim’s information (Vahid, 2017). Users can avoid these scams by refraining from clicking on links in e-mails. A user can instead go to the company’s actual website and log in to check for notifications and updates regarding their account. Phishing scams are constantly evolving, especially as smartphones become more common. One of the newer types of phishing scams is referred to as smishing, from SMS, or “short message service.” In this scam, “You get a fake text saying there’s a problem with one of your financial accounts. Or maybe a message offering a low-cost mortgage, a discount cruise, or a free gift card. If you respond by text, the scammer will know that the number is viable and may contact you to try to get more sensitive personal information. If you click on the link in the text directly, the scam artist may be able to install malware that can collect personal information (account numbers, passwords, etc.) from your phone” (Hickey, 2018). Con artists also phish for personal information via direct phone calls, pretending to be a “bank, creditor, insurance company, or government agency” (Hickey, 2018). Often the scammers use scare tactics to get the victim to divulge personal information before they have time to consider whether the call is a scam, so it is recommended never to provide personal information over the phone if you don’t know who’s calling. Instead, hang up and look up the phone number, or try calling it back to verify that it is legitimate.

Resources

Hickey, M. C. (2018). Protect Yourself From These 7 Scams. (cover story). Consumer Reports,
83(6), 26-33.

Vahid, F., & Lysecky, S. (2017). Computing technology for all. Retrieved from
zybooks.zyante.com/

Weber, R. M., & Horn, B. D. (2017). Breaking Bad Security Vulnerabilities. Journal Of
Financial Service Professionals, 71(1), 50-54.

Is Computer Literacy Necessary in Today's Workforce?: An Evaluation of Technology in Education

Computer literacy is extremely important in the education industry.  Whether one is an instructor, student, principal, or advisor, understanding how computers and applications work is essential to thriving in education.  As a high school and middle school teacher for 5 years, I experienced differences in technology capabilities at various schools.  Three of the schools where I taught had one-to-one technology initiatives.  One of these schools allowed students to use any device they wanted, while the other schools limited device options.

I experienced firsthand the challenge of teaching students who don’t understand how to utilize the technology they have access to.  In one school, I found myself teaching students how to right-click, copy and paste, open new browser tabs, etc., in a Biology class, because they didn’t have basic computer literacy skills.  In addition, I was forced to stop lessons to troubleshoot various issues on multiple types of devices, or to call on our IT team, who always had their hands full.

Two other schools where I worked did not have one-to-one initiatives, but instead had computer carts, which floated from classroom to classroom.  Again, the use of computers in these classrooms was commonplace, so students needed computer literacy in order to succeed at the given tasks.  Students used Google Classroom, Microsoft Office, and other applications to support their learning.  In addition, as a long-term substitute teacher at one of these schools, I taught a STEM class that used Lego Mindstorms robotics software, which I had to learn on the fly.

The use of computers in elementary, middle, and high schools continues to grow, since computers provide another dimension to learning and tend to engage students who might otherwise be “checked out.”  Students even use smartphones in some classrooms to engage in real-time assessments and interactive learning activities.  Therefore, teachers, principals, students, and staff members need computer literacy to keep up with the increasing technology demand in schools.

In higher education, it is nearly impossible to be employed without basic computer skills.  Many colleges and universities now provide online education opportunities for students, so faculty and staff must be prepared to use the technology that students interact with daily in their courses.  In addition, as an enrollment advisor, I train students to use their online classrooms and Student Portal and to communicate with their instructors.  I also utilize multiple databases to access student information and keep my daily activities organized.  If I didn’t have basic technology skills, there is no way that I would be able to complete the tasks necessary for my job.

The future of education is in technology.  Technology provides opportunities for remote learning, as well as free learning for those who have the focus and determination to teach themselves new skills.  I see the future of educational technology moving more towards cloud-based applications, since the cloud allows access from multiple devices, collaboration on documents, and also provides opportunities for greater storage.  In addition, I believe that in the next ten years, there will be as many opportunities for online degree programs in higher education as there are for traditional degree programs on campuses.  I presume there will always be a digital divide as a result of limited access to resources in disadvantaged communities, but I also believe that strides are being taken to bridge that gap and improve public access to technology.

Resources

Vahid, F., & Lysecky, S. (2017). Computing technology for all. Retrieved from zybooks.zyante.com/

Troubleshooting: Diagnosing a Network Connection

Packets of data travel from one destination to another through the network by transmission between routers.  The ping command reveals whether a destination is available, and how quickly it can be accessed.  The traceroute command reveals the path that packets of data traveled from one router to the next.  Each time a packet is sent from one router to another, that is considered a hop.  There are multiple paths that a packet can take to get to the same destination, depending on the routers that are available along the way.

In comparing the ping command for the three different websites that I used, all three pings successfully sent 4 packets of information.  In addition, the amount of data sent in all three pings was 32 bytes.  This allows for easy comparison of round trip time between the three websites.  The average round trip time for the packets when pinging google.com (USA) was 12 ms.  The average round trip time for packets sent to amazon.co.uk (United Kingdom) was 133 ms.  The average round trip time for packets sent to amazon.ca (Canada) was 79 ms.  This suggests that the farther the geographic distance, the longer the round trip time for packets to be sent.

On the other hand, the traceroute command told a different story.  The only traceroute that completed successfully was for google.com.  Although I ran the command multiple times for both amazon.co.uk and amazon.ca, the requests always timed out before the information was received.  This result confused me at first, since a timeout can indicate packet loss, yet all of the pings were successful.  A likely explanation is that some routers along the longer international paths are configured not to respond to traceroute probes, or deprioritize them, so those hops time out even though packets are still reaching the destination — which the successful pings confirm.

Pings and traceroutes can be used to diagnose connection problems in a few ways.  A failed ping request may tell the user that the destination is not reachable.  In addition, the ping request can tell the user how quickly packets make the round trip.  A traceroute can tell the user the path through which packets are traveling.  If a user is not able to access a destination, they may run a traceroute to pinpoint where along the path the data is being lost.  Since the traceroute shows the routers associated with each hop, the specific router that is failing may be identified.  In some cases, running a ping or traceroute may even reveal that the connection problem lies in the user's own device.  Cases where a traceroute may time out or a ping may fail include when a router or firewall along the path blocks the probes, or when the destination server is down.
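The hop-by-hop diagnosis described above can be sketched in Python.  The traceroute output below is a hypothetical, simplified sample (real output also includes hostnames and IP addresses), where "*" marks a probe that timed out:

```python
def first_failing_hop(traceroute_lines):
    """Return the number of the first hop whose probes all timed out,
    or None if every hop responded."""
    for line in traceroute_lines:
        parts = line.split()
        hop = int(parts[0])
        # A hop "fails" here when every probe result after the hop
        # number is a "*" (a timeout).
        if all(p == "*" for p in parts[1:]):
            return hop
    return None

# Hypothetical, simplified traceroute output: hop number followed by
# three probe results.
sample = [
    "1 1ms 1ms 2ms",
    "2 8ms 9ms 8ms",
    "3 * * *",
]
print(first_failing_hop(sample))  # hop 3 is where replies stop
```

In practice, the first hop that stops replying points at the router (or the firewall in front of it) to investigate.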

To view pings and traceroutes, please click on the following link:
Pings and Traceroutes

Resources

Vahid, F., & Lysecky, S. (2017). Computing technology for all. Retrieved from zybooks.zyante.com/

Ping to google.com
Ping to amazon.co.uk
Ping to amazon.ca
Traceroute to google.com
Traceroute to amazon.co.uk
Traceroute to amazon.ca


Which Office Application is Best?

Recently, I completed an assignment in which I explored the functions of various Microsoft Office applications.  My experience using Word, Excel, PowerPoint, and Access for the assignment was pretty uneventful.  During my five years as a teacher, I used every one of those applications, with the exception of Access, almost every day.  I especially used PowerPoint to deliver science content to my students.  Access is new to me, and I had a bit more difficulty thinking of ways that the application could be used in my own life.  I do use a database for work, since I need a way to manage new leads and the students I work with, but instead of Access, we use Talisma and Campus Vue (two student management database systems).

Each of the applications that we used has a different function.  The function of the word processor is to create a text document with formatting, and with the ability to add style using font families, bold, italics, underlining, pictures, and so on.  The function of the spreadsheet is to input, sort, and track data.  The function of the presentation application is to create engaging ways to deliver information using graphics and animations.  The function of the database is to run queries that gather specific pieces of information.

The advantage of the word processor is that most of the formatting and styling functions can be found directly in the toolbar at the top of the page.  This application allows a user to easily type a document and then very quickly change the font size and family, add bulleted or numbered lists, or adjust the paragraph and page formatting options.  However, Word would not be a great application for presenting information or analyzing data.  The spreadsheet application is easy to use because once data has been entered, the user can easily create a graph or chart to analyze the data.  On the other hand, Excel would not be the best application for presenting information to an audience or for writing an essay, since the data is arranged in the format of a table.  The presentation application, PowerPoint, is easy to use because a user can add multiple slides, use a template to display information in a visually appealing way, and add animations at the click of a button to increase engagement.  However, PowerPoint would not be the application of choice when writing an essay, since a user would want a single document, not multiple slides.  The database application's advantage is that it allows the user to quickly pull data according to specific records or fields, and it allows communication between multiple tables of data to quickly acquire and organize information.  On the other hand, the database would not be great for presentations, since it does not use images or animations to keep the attention of the audience.

The application that would be the best for documenting information about my day depends on the intended use of the information. Word might be the application of choice if I were writing a short story about my day that could later be published on a blog. Excel would be the best application if I wanted to track how much time I spend doing each activity per day over the period of a year. PowerPoint would be the ideal application if I were going to present about my day to an audience. Access might be the best application if I were going to compare the amount of time I spent doing activities throughout my day to other students at Ashford University, whose information was also in the database.

Another use for the Word application, aside from documenting my day, would be if I wanted to write a book.  Word would allow me to quickly type the document, run a spell check, change formatting and fonts, and use the word count feature if needed.  I could use Excel if I were conducting an investigation studying how quickly bacteria grow on a petri dish.  The application would allow me to record quantitative data on the rate of bacterial growth, and I could then use Excel to create a line graph showing the growth of the bacteria over time.  PowerPoint could be used if I needed to present a training at work on how to track where a student is in the enrollment process.  Access might be used if I were to create a Pinterest store and needed to gather information about my inventory, customers, or orders.  All in all, there is a purpose for every application, but the best one depends on the intended use of the information in the end.

Resources

Vahid, F., & Lysecky, S. (2017). Computing technology for all. Retrieved from zybooks.zyante.com/

What's Your Favorite Mobile App?

One of my favorite mobile apps is Instagram.  I use it on a daily basis to post a photo a day.  I love photography, but about two years ago, I realized that I was taking tons of photos, yet I didn’t have the urgency or discipline to cull, edit, and publish them regularly.  My New Year’s resolution in 2017 was to post a photo a day on both Twitter and Instagram (https://twitter.com/aliciapiavis and https://www.instagram.com/aliciapiavis/).  This allowed me to practice editing every single day of the year.  I maintained this goal for the whole year, and after realizing how beneficial that structure was to me, I decided to continue the project this year as well.  Posting a photo a day provides me with the routine I need to regularly work on and improve my photography.

I chose Instagram to post my photos because I wanted to use an app that would easily show a history of my posts.  I felt that my photos would get lost amongst my status posts on Facebook, and I knew that Twitter is utilized more for following celebrities, news, and other media.  I already had an Instagram account that I had not been using, so I simply changed the purpose of my account to solely serve the photo-a-day challenge.  I learned that Instagram is fairly easy to use.  The app can be downloaded from both the Apple App Store, as well as Google Play.  As soon as the user opens the app, they can see the newsfeed of recently added photos from the other users they follow.  In addition, the landing page has a button right in the middle of the bottom task bar with a plus sign, which allows the user to add a new photo.  The user can then choose if they want to upload a photo from their cell phone’s photo gallery, take a new photo, or take a video.

I personally never use the photo option, since 99% of the photos that I upload were previously shot with my DSLR camera and then edited in Lightroom.  I have a slightly unique process for uploading my photos, since I also post the same photo to Twitter.  I post my photo to Twitter from my computer.  Then, I open the Twitter app on my phone, open the photo I just posted, and save it to the gallery on my phone.  Then I simply open the Instagram app, click the “+” sign, choose “Gallery”, and click on the photo that I just saved.  Since the photo is already edited, I skip the editing feature on Instagram and go straight to the “Share” page, where I write my comment and then share it.  Basically, I can post a photo from the Instagram app on my phone in less than a minute, which is great usability.

I love the design of the app because the layout is very simple.  When the user opens the app, their newsfeed loads first, showing all the recent posts from the people they follow.  There are only five buttons on the bottom taskbar.  The user can easily go to their newsfeed by clicking the home icon.  They can search content with keywords using the magnifying glass icon.  They can see notifications for people who have liked or commented on their photos by using the heart icon, and they can see their own feed by selecting the person icon.  In addition, if a user wants to upload a photo and use a filter, there are many filters built in to change the look of the image at the tap of a button.

In regards to functionality, the app has a few functions, but they all revolve around one thing: photos.  One of the functions that Instagram offers is the ability for a user to edit their photo or apply a filter before publishing it, and the user can also write a comment about the image.  In addition, users can “tag” their photos to appear in keyword searches or to be placed into a specific category.  This function allows a user’s content to be more easily discovered by other users.  For example, if I were to search “mountains” in the app, I would receive all results that were tagged with #mountains.  I can also sort the results by “Top” (most popular), “People” (names of users), “Tags”, or “Places” (the location where the photo was taken).  Another function of the app is that it is linked to other social media apps, including Facebook, Twitter, and Tumblr, so the user can easily post a photo to multiple sources at once.  Lastly, the app lets the user choose whom to follow.  For example, I love travel, nature, and wildlife photography, so many of the users I follow post similar content.  I can also go directly to their Instagram feeds to see all of the posts that they have created.

As far as recommendations go, there are three areas where I see room for improvement.  The first is that Instagram only allows photos of a certain aspect ratio to be posted.  For example, some of the photos I have wanted to post are panoramas.  They post just fine as panoramas on Twitter.  However, as soon as I try to add the photo to Instagram, the platform immediately crops the image and cuts off the edges of the panorama.  The same goes for portrait photos.  If I try to post a portrait-oriented photo on Instagram, either the top or bottom will get cut off.  Another area of improvement applies to the web version of the app.  Unfortunately, users cannot create new posts through the web version; new posts can only be created through the mobile app.  Lastly, I wish there were a way to get rid of bots.  If there were a bot to get rid of bots, that would be great!  Every now and then I see a notification that someone has commented on one of my posts.  I have a fleeting moment of excitement, until I realize that it was just a bot asking me to follow them.  Other than those three elements, I love Instagram because it’s easy to use, the design layout is very simple, and it has the majority of functions that I need.


Friday, June 22, 2018

Scratch: My Frustrating Attempt to Write Block Code Using a Children's Program...

I tried Scratch (a children's program that teaches a block-based programming language) for the first time ever!  Here's what I came up with in my sorry first attempt...

Penguin Man Program (click the link to see the program): https://scratch.mit.edu/projects/227437790/

My experience building the Scratch program was actually quite frustrating. It didn’t help that I had just come off of a long week in which I was wrapping up my previous course while beginning my current one (I am doing concurrent enrollment for school). The weekend before my Scratch assignment was due, I was tied up completing my Medical Terminology final, a two-day (Saturday and Sunday) coding workshop through Turing School of Software and Technology, and the assignments and reading for my INT100 course. I also work full time. Honestly, I wish I had had more time and energy to explore Scratch sooner, but due to an unusually busy schedule, I was not able to explore the program until the night the assignment was due.

The Scratch software was not very intuitive to me, even though I read the user manual first. The manual was helpful for basic tips, but I found that creating an animation with very little brainpower left at the end of the day was rather difficult. In addition, I couldn’t figure out why actions seemed to be occurring so quickly, almost as if they were stacked on top of one another. I realized that using the “Wait” block in between actions was helpful for slowing down the process. It also took me a while to figure out how to lengthen the duration of the drum beats; I eventually placed the drum beat inside the “Repeat” block to create more beats. I also realized that “Point in direction __” is very different from “Turn __ degrees.” I found it helpful to click on blocks separately to observe their actions before adding them to the stack. In the end, the result was a very basic animation stemming from a struggle with both creativity and patience.
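The wait-inside-a-repeat pattern that finally worked for the drum beats has a direct analog in a text-based language. Here is a minimal Python sketch; the function name and the list of "played" beats are my own stand-ins, since Scratch actually plays sounds:

```python
import time

def play_drum_beats(beats, interval_seconds=0.5):
    """Repeat a drum beat a fixed number of times, pausing between
    beats. The pause plays the same role as Scratch's "Wait" block
    nested inside a "Repeat" block."""
    played = []
    for beat in range(1, beats + 1):
        played.append(beat)           # stand-in for playing one drum sound
        time.sleep(interval_seconds)  # the "Wait" slows the loop down
    return played

print(play_drum_beats(4, interval_seconds=0.01))  # [1, 2, 3, 4]
```

Without the `sleep` call, all the beats would fire nearly instantly, which is exactly the stacked-on-top-of-one-another effect I was seeing.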

This exercise taught me that trial and error can be a useful method in programming. In addition, I learned that the steps in a program occur sequentially, and that they need to be specific. I also learned that programming takes patience and perseverance, and that I might not get my desired outcome the first time. In the real world, it would be helpful to have a plan or design first, and then to write the code that fulfills that plan. 

My experience programming with Scratch was a bit more frustrating than my experience with the participation activities in my course textbook. I think that was partially due to the fact that the textbook asks for a desired outcome, whereas with Scratch I had no objective other than to create a program using 30 blocks with a variety of components. The Scratch program was very open-ended and required us to design a program, along with writing the code. I caught on pretty quickly to the participation exercises in the book because they were very straightforward and had a desired outcome, but I can see how in the bigger scheme of things, languages like machine language would be very impractical. 

The programming languages that my class reviewed in our textbook are all very different from one another. Machine language is the lowest-level language, and the only language that computers can read and understand. Machine language is simply binary code made up of ones and zeros. Assembly language is a step up from machine language in that it reads a bit more like English; however, writing complex programs in assembly language quickly becomes impractical. High-level languages like Python are the most efficient for programmers. They have similarities to English, and they allow for very complex functions and logic, including conditionals and loops.
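To make the comparison concrete, here is a small (hypothetical) Python example of the conditionals and loops mentioned above; expressing the same logic in machine language would take a long string of binary instructions:

```python
def classify_numbers(numbers):
    """Split a list into even and odd values using a loop and a conditional."""
    evens, odds = [], []
    for n in numbers:        # loop over every value
        if n % 2 == 0:       # conditional: test evenness
            evens.append(n)
        else:
            odds.append(n)
    return evens, odds

print(classify_numbers([1, 2, 3, 4, 5]))  # ([2, 4], [1, 3, 5])
```

A few readable lines like these are exactly what makes high-level languages so much more productive to work in.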

Honestly, I found Python the easiest to use in the participation exercises found in the textbook. The binary threw me off a bit when trying to use machine language and assembly language. It took me some time to see the connection between “input”, “output”, “add”, and “start” and the corresponding binary code. In addition, it took some time for me to conceptualize that a variable names a location in memory. I think that the programming exercises in Python came the easiest to me because I have a little bit of experience in Java, Ruby, HTML, CSS, and JavaScript, so Python was the most familiar. When I was trying to program in Scratch, I understood that the logic was similar to Python and other high-level languages, but some of the blocks of code in Scratch performed actions differently than I predicted. In addition, I felt that I didn’t have enough time to fully understand all of the icons and nuances of the software to use it effectively.

In regards to use, machine language is the most basic language of computers, so it will always serve as the foundational language that the others are built on top of. Assembly language can be used for basic functions and manipulations of data. High-level languages like Python can be used for complex manipulations of data and large data sets. I believe that high-level languages are the most popular today because developers can accomplish tasks much more quickly in them than in machine language or assembly language. Therefore, productivity in an organization can be increased.

Resources 

Vahid, F., & Lysecky, S. (2017). Computing technology for all. Retrieved from zybooks.zyante.com/