...Market Efficiency and the Johannesburg Securities Exchange

Table of Contents
1. Abstract
2. Introduction
3. The Johannesburg Securities Exchange
3.1. History
3.2. Function
4. The Efficient Market Hypothesis
4.1. Strong form
4.2. Semi-strong form
4.3. Weak form
4.4. Random Walk Hypothesis
5. Empirical evidence
5.1. Joint Hypothesis Problem
5.2. Capital Asset Pricing Model
5.3. Empirical evidence on investor overreaction
6. Comparisons to international stock markets
7. Conclusion
8. Bibliography

1. Abstract
The JSE is a securities exchange based in South Africa and is considered the largest on the African continent. More than 400 stocks are traded on the JSE, so it is important that investors are aware of the relevant information regarding stocks, which would enable them to make sound investments. The Efficient Market Hypothesis is used to ascertain whether certain stocks and their respective prices in a particular market reflect all necessary information, which would indicate an efficient market (Fama, 1970). Carrado and Jordan (2000) support the aforementioned statement by affirming that markets are efficient with respect to specific sources of information, on condition that the information cannot be exploited to earn above-average returns. Furthermore, Fama (1965) explained the efficiency of markets and their stock prices by analyzing the three forms of...
Words: 5133 - Pages: 21
...Any piece of literature, whether it be a famous play by Shakespeare or a research paper by a random college student, has a purpose. Some purposes are to entertain, to express oneself, or to educate; in the case of "Politically Incorrect? Or Master Strategists? Try Both" by Kareem Abdul-Jabbar, it was to persuade. However, Abdul-Jabbar tried to hide his attempt to persuade behind an article that was supposed to seem informative and educational on the surface. To gather credibility for a series of weak and biased arguments, Kareem Abdul-Jabbar misuses statistics and manipulates human emotions. Abdul-Jabbar's creative and controlling writing is subtly integrated and goes unnoticed unless examined further. Abdul-Jabbar's most powerful tool throughout his article is his ability to appeal to people's emotions. He uses quotes that attract attention by appealing to people's emotions and also make people feel comfortable with him as a person; this way people are more likely to agree with him. Abdul-Jabbar does this multiple times early on in his article in an effort to build a sense of importance with his...
Words: 1235 - Pages: 5
...Fin 700 Tianqi Sun Dr. Al. Barzykowsi Dec. 19, 2015 Short Paper - Statistical Methods This paper discusses statistical methods. Statistical inference characterizes a population by drawing conclusions from a representative sample of it; since scientists rarely observe an entire population, sampling and statistical inference are essential. The paper first discusses some general principles for planning experiments and visualizing data, and then focuses on the appropriate choice of standard statistical models and statistical inference methods. First, the standard models are described. Interval estimation and hypothesis testing for the parameters of these models are also described, including the two-sample case, where the purpose is to compare two or more populations in terms of their means and variances. Second, non-parametric inference tests are described for cases where the sample distribution is not compatible with a standard parametric distribution. Third, resampling methods, which use computer-generated random samples to characterize distributions and support estimation and statistical inference, are introduced. Methods for handling multivariate data are covered in the following sections, and the clinical trial process is also briefly reviewed. Finally, the last section discusses statistical computer software, with a collection of citations suited to different levels of expertise...
Words: 702 - Pages: 3
...activity” (Brockopp & Hastings-Tolsma, 2003, p. 59). A well-thought-out design provides assurance that the evidence has practicality. Literature Review A thorough literature review lends credibility to the study. The literature review provides the foundation for the study's significance and relationship to practice. The literature review is generally summarised in the introductory section or under a specific heading such as a review of the literature (Polit and Hungler 1997). Reference to original sources is important, as information can be taken out of context and used inappropriately; therefore, an abundance of secondary sources should be viewed with caution, as they may not provide sufficient detail or may distort some aspects of the original research (Polit and Hungler 1997; Burns and Grove 1993). The purpose of the literature review is to discuss what is known, identify gaps in knowledge, establish the significance of the study, and situate the study within the current body of knowledge (Polit and Hungler 1997). This is supported by Burns and Grove (2001), who consider the primary purpose of reviewing the literature to be gaining a broad background or understanding of the available information...
Words: 1099 - Pages: 5
...Céline Polidori Physics SL Mr. White 12.12.13 Focal Length

Research Question: What is the focal length of a convex lens when calculated graphically?

Variables:
* Independent: the position of the lens
* Dependent: the distance from the object (u) and the image (v) to the lens
* Controlled: the object and the lens

Materials:

Method:
1. Gather materials and set up the experiment as in the diagram
* Choose a set object and lens that will not be changed throughout the experiment, as they are the controlled variables
* Make sure that the ruler is aligned with the wick of the candle and the paper as well as possible to reduce uncertainty in the distances
* Align the center of the lens with the wick of the candle.
2. Place the lens at 200 mm from the candle
3. Move the paper along the ruler until a clear image is obtained
4. Record measurements
5. Repeat the experiment at least 3 times every 20 mm along the ruler.

Data: image and object distances (mm) from the lens, ±30 mm

| Object distance (u) | Image distance (v), trial 1 | Trial 2 | Trial 3 | Average |
| 200 | 610 | 624 | 616 | 617 |
| 220 | 445 | 447 | 464 | 452 |
| 240 | 392 | 395 | 396 | 394 |
| 300 | 295 | 291 | 290 | 292 |
| 400 | 245 | 241 | 240 | 242 |
| 500 | 208 | 214 | 215 | 212 |

Uncertainties:
* Ruler: ±5 mm
* Lens thickness: ±5 mm
* Image position until clear: ±20 mm
* Uncertainty for the image and object distance = ±30 mm ...
Words: 893 - Pages: 4
...Sampling Techniques Psychology 341 August 11, 2013 ABSTRACT The present research paper was designed to discuss the different types of sampling methods used to conduct research in the field of psychology. The sampling techniques included in this paper are probability sampling, non-probability sampling, surveys and questionnaires. Examples of each type of technique are given to further the understanding of each specific type. Furthermore, some of the most important aspects that should be considered before selecting a method are outlined in detail. Sampling Techniques When conducting research, it is almost impossible to study the entire population that we are interested in looking at in more depth. For example, if we were interested in comparing the level of romantic satisfaction among college students in the United States, it would be practically impossible to survey every single person attending college in the country. Not only would it take an extremely long time to do so, but it would also be very expensive. That is why researchers use small samples from the population to gather their data instead. A sample is particularly useful because it allows the researcher to make inferences about a specific population without having to actually survey the entire population (Trochim, 2006). There are several sampling techniques used to gather information about a sample. Some of these include probability sampling, non-probability sampling, surveys, and questionnaires...
Words: 1536 - Pages: 7
...Website Analysis Paper I am going to evaluate the page layout, navigation, and performance of www.worlds-worst-website.com, a website that, from my knowledge, was designed to be a badly designed website. However, for this paper I will treat it as if it were a website meant to give facts or information about things on the internet or the world. I will discuss what the website has as content, who this content was made for, the way it navigates, and how it can be improved. This website is really bad, so my goal is to try to come up with a way to make this website bearable. There are many things wrong with this website that make it bad. What is on www.worlds-worst-website.com? Well, there are many links on this site that take you to different websites. Two of the links send you to the same website, just to two different pages of that site. One link seems to take you to a game that has something to do with Obama. One is just a video of cats. Then there are links about different dinosaurs and one link about bicycles. There is also a link that makes you take a quiz about a show. This site's content is, well, everywhere. There is no actual main content on this site. There are just random links to different sites, which makes the site more like a link-sharing site if anything. I am also going to say that the websites it sends you to are better than this site; however, I still do not think they are better or really look that impressive. They look really old and need to...
Words: 787 - Pages: 4
...Security Issues in Mobile Computing Srikanth Pullela Department of Computer Science University of Texas at Arlington E-mail: pvssrikath@hotmail.com Abstract In the present mobile communication environment, a lot of research is going on to improve the performance of issues like handoffs, routing, etc. Security is another key issue that needs to be considered, and it comes into the picture once the communication channel is set up. Many security protocols have been proposed for different applications like the Wireless Application Protocol, 802.11, etc. Most of them are based on public- and private-key cryptography. This paper provides an insight into these cryptographic protocols and also looks into the current research project going on at Sun Microsystems Labs on wireless security. 1. Introduction With the rapid growth in wireless mobile communication technology, small devices like PDAs and laptops are able to communicate with the fixed wired network while in motion. Because of this flexibility and the provision of a ubiquitous infrastructure, the need to provide security increases to a great degree. As wireless communication takes place mainly through radio signals rather than wires, it is easier to intercept or eavesdrop on the communication channels. Therefore, it is important to provide security against all these threats. There are different kinds of issues within security, like confidentiality, integrity, availability, legitimacy, and accountability, that need...
Words: 4692 - Pages: 19
...Econometrics of Random Walk Hypothesis ABSTRACT The random walk hypothesis is a key instrument used in the analysis of forecasting in the economic and financial market. It is used primarily in the forecasting of the prices of stocks: to determine and forecast the prices of stocks given previous stock prices. This paper discusses the basis of the hypothesis, the two types of random walk hypothesis, its framework, methodologies and an analysis of its repercussions. INTRODUCTION The random walk hypothesis states that stock price changes have the same distribution and are independent of one another, so the past movement or trend of a stock price or of the market as a whole cannot be used to predict its future price or any possible future trends. The concept originated in the late 1800s with Jules Regnault, a French broker, and Louis Bachelier, a French mathematician, whose Ph.D. dissertation was titled "The Theory of Speculation". The same ideas were later developed and studied further by Paul Cootner, an MIT Sloan School of Management professor, in his 1964 book The Random Character of Stock Market Prices. The term was popularized by the 1973 book A Random Walk Down Wall Street, by Burton Malkiel, a professor of economics at Princeton University, and was used earlier in Eugene Fama's 1965 article "Random Walks In Stock Market Prices". The theory that stock prices move randomly was earlier proposed by Maurice Kendall in his 1953 paper, “The Analytics of Economic Time...
Words: 2111 - Pages: 9
...Neanderthals are extinct members of the species Homo neanderthalensis (Wong 99). They lived on the European continent about 20,000 to 40,000 years ago. Neanderthals were very similar to humans in their physical appearance. They are even considered to be humans who existed before we humans came along. When we are compared to Neanderthals, we are considered modern humans. The extinction of the Neanderthals could have been caused by many things, but scientists believe that the cause of extinction was evolutionary forces. There are a few evolutionary forces that may have played a role in the extinction of the Neanderthals. But if we, modern humans, are still alive, wouldn't a species so similar to us also be alive? This suggests that the Neanderthals were different from modern humans and that evolutionary forces had a role to play. Modern humans and Neanderthals came from the same lineage; humans did not develop from Neanderthals. The Neanderthals weren't studied until a skull was found in Germany (Wong 99). They were classified as a whole different species because of their structural differences, but it turns out that their differences were not so major at all. Compared to early modern Europeans, Neanderthals had a receding forehead, a strong brow ridge, and no chin; early modern Europeans had a steeper forehead, a delicate brow ridge, and a chin (Wong 100). The Neanderthals adapted to their environment. An example would be their big bodies that were built as...
Words: 765 - Pages: 4
...Random numbers in C++ and The Pythagorean Theorem Name Course Date Random numbers in C++ and The Pythagorean Theorem Introduction Computer programs, in light of the technological advances that have been made, arguably represent one of the most important concepts in such developments. A set of instructions designed to assist a computer to perform a given task is referred to as a computer program. There are numerous languages used to create/design computer programs, for instance JavaScript, Java, C++, SQL and Sage (Laine, 2013). Computer programming is defined as the process of developing a working set of computer instructions meant to aid the computer in the performance of a given task. Computer programming starts with the formulation of a valid computer problem. This process is then followed by the development of an executable computer program, for instance the Firefox Web Browser (Laine, 2013). It is worth noting that there are other programs in the same realm. Computer programming is a diverse field that is of utmost importance in the modern world, especially with the continuous expansion of the internet. Perhaps its relevance can be underlined by the fact that computer programming has been carved out as a course in itself. Computer programming is offered in several courses studied in colleges and universities (Laine, 2013). Computer programming is not only for computer students but for all who use computers on a day-to-day basis. This is by extension everyone, since the...
Words: 9330 - Pages: 38
...Solving the reader collision problem in large-scale RFID systems: Algorithms, performance evaluation and discussions John Sum, Kevin Ho, Siu-chung Lau Abstract—Assigning neighboring RFID readers non-overlapping interrogation time slots is one approach to solving the reader collision problem. To this end, the Distributed Color Selection (DCS) and Colorwave algorithms have been developed, and simulated annealing (SA) techniques have been applied. Some of them (we call them non-progressive algorithms), like DCS, require the user to pre-define the number of time slots, while some of them (we call them progressive), like Colorwave, determine the number automatically. In this paper, a comparative analysis of both non-progressive and progressive algorithms for solving such a problem in a random RFID reader network is presented. Through extensive simulations on a dense network consisting of 250 readers whose transmission rates are 100%, a number of useful results have been found. For the non-progressive type algorithms, it is found that DCS is unlikely to generate a collision-free solution, even when the number of time slots is set to 20. On the other hand, heuristic and SA-based algorithms can produce collision-free solutions whenever the number of time slots is set to 16. For the cases where the number of time slots is not specified, heuristic-based, SA-based and Colorwave algorithms are all able to determine the number automatically and thus generate collision-free solutions. However, SA-based algorithms require...
Words: 6608 - Pages: 27
...Final Paper Introduction There are many legal and ethical issues that we face in the world today. Businesses are the ones hit hard by these issues and need to find ways to identify them while also doing what they can to prevent any loss of business as a result. One example that this paper will talk about is the issue of drug testing. Drug testing is a sensitive issue, as there are many different viewpoints regarding it. Drug testing, also known as drug screening, was used increasingly in the 1990s to test for the presence of illegal narcotics in the blood or urine of employees. An employee abusing illegal narcotics may be impaired and thus at a greater risk of injury or illness on the job (Kesselring & Pittman, 2002). A previous company I used to work for had many issues regarding drug testing, and it sometimes came back to hurt them. Drug testing has become an issue of outstanding social concern across the country and has been used by many employers (Wall, 1992). This paper will explain the issues that drug testing causes as well as some of the ethical concerns raised by the situation. This paper will also explain the laws that affect drug testing as well as recommendations to reduce liability exposure and improve the ethical climate of the situation. Description of Business that presents a legal and ethical issue Drug testing was a big concern of a company I used to work for in San Diego. Sterling Security was a company that was bought out in 2004...
Words: 2133 - Pages: 9
...policing also create a focus on accountability, which has caused some police officers to respond negatively to such pressures (Willis, Mastrofski & Weisburd, 2003). The purpose of this paper is to briefly examine predictive policing and how tools such as COMPSTAT allow police departments to respond more efficiently to criminal activity. Information Technology vs. Random Patrols Before discussing specific issues involving the use of COMPSTAT as part of predictive policing, it is important to compare and contrast the use of information technology as a way of optimizing police department performance with the more traditional use of random street patrols. The use of information technology applications and performing random street patrols are actually similar because both methods of identifying and responding to crime are intended to examine the events and conditions that exist in a particular area so that criminal activity can be prevented before it actually occurs (Bratton & Malinowski, 2008). In essence, regardless of whether information technology applications or random street patrols are used, the overall goal is to be positioned in a way that fast responses to criminal activity can occur, but also that crime might actually be stopped before it occurs. However, an important contrast between the use of information technology and random street patrols involves the way in which resources are used. With the use of random street patrols, the ability to prevent crime is largely based...
Words: 1291 - Pages: 6
...tests such as the Augmented Dickey-Fuller test and a panel unit root test. Additionally, the existence of a random walk in these stock markets has also been examined through the Jarque-Bera statistic. The results indicate informational inefficiency in the time period under study for all indices. Investors can therefore predict future prices on the basis of historical information and receive excessive returns. The results have implications for developing economies, wherein the government has to ensure that all asset-related information is made public, to curb state interference. Introduction The concept of the Efficient Market Hypothesis (EMH) holds special importance in the field of finance, especially capital markets. This hypothesis postulates that markets are informationally efficient: the price of any security will fully reflect all the information that is available to investors. That being said, one cannot consistently achieve returns in excess of average market returns on a risk-adjusted basis with the information available at the time of investment. First developed separately by Eugene F. Fama and Paul A. Samuelson, the concept assumes that investors need not be rational. In an efficient market, investors may either overreact or underreact to newly available information. Investors' reactions are random, such that the price changes are random as well....
Words: 6344 - Pages: 26