Pigeon Feed



Reply to topic  [ 22 posts ]  Go to page Previous  1, 2, 3  Next
 Algorithms 

Joined: Thu Mar 31, 2011 4:00 pm
Posts: 10071
Post Re: Algorithms
In mathematics, an average is a measure of the "middle" or "typical" value of a data set. It is thus a measure of central tendency.

In the most common case, the data set is a list of numbers. The average of a list of numbers is a single number intended to typify the numbers in the list. If all the numbers in the list are the same, then this number should be used. If the numbers are not the same, the average is calculated by combining the numbers from the list in a specific way and computing a single number as being the average of the list.

Many different descriptive statistics can be chosen as a measure of the central tendency of the data items. These include the arithmetic mean, the median, and the mode. Other statistics, such as the standard deviation and the range, are called measures of spread and describe how spread out the data is.

The most common statistic is the arithmetic mean, but depending on the nature of the data other types of central tendency may be more appropriate. For example, the median is used most often when the distribution of the values is skewed with a small number of very high or low values, as seen with house prices or incomes. It is also used when extreme values are likely to be anomalous or less reliable than the other values (e.g. as a result of measurement error), because the median takes less account of extreme values than the mean does.

The arithmetic mean, or simply the mean or average when the context is clear, is the central tendency of a collection of numbers taken as the sum of the numbers divided by the size of the collection.

The median is the middle number of the group when they are ranked in order. (If there are an even number of numbers, the mean of the middle two is taken.)

Thus to find the median, order the list according to its elements' magnitude and then repeatedly remove the pair consisting of the highest and lowest values until either one or two values are left. If exactly one value is left, it is the median; if two values, the median is the arithmetic mean of these two. This method takes the list 1, 7, 3, 13 and orders it to read 1, 3, 7, 13. Then the 1 and 13 are removed to obtain the list 3, 7. Since there are two elements in this remaining list, the median is their arithmetic mean, (3 + 7)/2 = 5.
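The pair-removal method above can be sketched in Python (a minimal illustration for non-empty lists; a practical version would just sort and index into the middle):

```python
def median(values):
    """Median via pair removal: sort, then repeatedly drop the
    lowest and highest values until one or two remain."""
    ordered = sorted(values)
    while len(ordered) > 2:
        ordered = ordered[1:-1]  # remove the lowest and highest pair
    if len(ordered) == 1:
        return ordered[0]
    return (ordered[0] + ordered[1]) / 2  # arithmetic mean of the middle two

print(median([1, 7, 3, 13]))  # the worked example above: (3 + 7)/2 = 5.0
```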


Fri Nov 30, 2012 11:03 pm

Joined: Thu Mar 31, 2011 4:00 pm
Posts: 10071
Post Re: Algorithms
The harmonic mean of a non-empty collection of numbers a1, a2, ..., an, all different from 0, is defined as the reciprocal of the arithmetic mean of the reciprocals of the ai's:

H = n / (1/a1 + 1/a2 + ... + 1/an)

One example where it is useful is calculating the average speed for a number of fixed-distance trips. For example, if the speed for going from point A to B was 60 km/h, and the speed for returning from B to A was 40 km/h, then the average speed is given by

2 / (1/60 + 1/40) = 48


I realize this is what we do, in a different way, to compute the average speed for a series of bird races.

(Total distance in miles * (1760 "yards per mile" * 60 "minutes per hour")) / total seconds = average yards per minute

e.g.: (60 * 105600) / 3600 = 1760 ypm

60 miles flown in 1 hour
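Both calculations can be sketched in Python (the function names are just for illustration):

```python
def harmonic_mean(values):
    """Reciprocal of the arithmetic mean of the reciprocals."""
    return len(values) / sum(1 / v for v in values)

# Round trip at 60 km/h out and 40 km/h back (fixed distance each way)
print(harmonic_mean([60, 40]))  # ~48.0 km/h

def yards_per_minute(miles, seconds):
    """Race speed: (miles * 1760 yards/mile * 60 min/h) / total seconds."""
    return miles * 1760 * 60 / seconds

print(yards_per_minute(60, 3600))  # 1760.0 -- 60 miles flown in 1 hour
```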


Fri Nov 30, 2012 11:50 pm

Joined: Thu Mar 31, 2011 4:00 pm
Posts: 10071
Post Re: Algorithms
The Doomsday rule or Doomsday algorithm is a way of calculating the day of the week of a given date. It provides a perpetual calendar since the Gregorian calendar moves in cycles of 400 years.

This algorithm for mental calculation was devised by John Conway[1][2] after drawing inspiration from Lewis Carroll's work on a perpetual calendar algorithm.[3][4] It takes advantage of the fact that each year has a certain day of the week (the doomsday) upon which certain easy-to-remember dates fall; for example, 4/4, 6/6, 8/8, 10/10, 12/12, and the last day of February all occur on the same day of the week in any given year. Applying the Doomsday algorithm involves three steps:

  • Determine the "anchor day" for the century.
  • Use the anchor day for the century to calculate the doomsday for the year.
  • Choose the closest date out of the ones that always fall on the doomsday (e.g. 4/4, 6/6, 8/8), and count the number of days (modulo 7) between that date and the date in question to arrive at the day of the week.

This technique applies to both the Gregorian calendar A.D. and the Julian calendar, although their doomsdays will usually be different days of the week.

Since this algorithm involves treating days of the week like numbers modulo 7, John Conway suggests thinking of the days of the week as Noneday, Oneday, Twosday, Treblesday, Foursday, Fiveday, and Six-a-day.

The algorithm is simple enough for anyone with basic arithmetic ability to do the calculations mentally. Conway can usually give the correct answer in under two seconds.

It's useful to note that Christmas Day is always the day before Doomsday ("One off Doomsday"). In addition, July 4 is always on a Doomsday, and so is Halloween.
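The three steps can be sketched in Python for the Gregorian calendar (the per-month doomsday dates are the standard mnemonic set the post mentions, plus the less memorable ones for the remaining months):

```python
def day_of_week(year, month, day):
    """Doomsday algorithm, Gregorian calendar only."""
    # Step 1: anchor day for the century (e.g. Tuesday = 2 for the 2000s)
    century = year // 100
    anchor = (5 * (century % 4) + 2) % 7
    # Step 2: doomsday for the year from the century anchor
    yy = year % 100
    doomsday = (anchor + yy + yy // 4) % 7
    # Step 3: count days (mod 7) from the month's doomsday date:
    # Jan 3 (4 in leap years), last day of Feb, 3/14, 4/4, 5/9, 6/6,
    # 7/11, 8/8, 9/5, 10/10, 11/7, 12/12
    leap = year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
    anchors = [4 if leap else 3, 29 if leap else 28,
               14, 4, 9, 6, 11, 8, 5, 10, 7, 12]
    dow = (doomsday + day - anchors[month - 1]) % 7
    return ["Sunday", "Monday", "Tuesday", "Wednesday",
            "Thursday", "Friday", "Saturday"][dow]

print(day_of_week(2012, 12, 25))  # Tuesday ("one off Doomsday")
```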

Link


Zeller's congruence is an algorithm devised by Christian Zeller to calculate the day of the week for any Julian or Gregorian calendar date.
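For the Gregorian case, Zeller's congruence can be written directly as a one-line formula (a sketch; note its unusual convention that 0 means Saturday):

```python
def zeller(year, month, day):
    """Zeller's congruence, Gregorian form.
    Returns 0..6 where 0 = Saturday, 1 = Sunday, ..., 6 = Friday."""
    if month < 3:            # January and February count as months
        month += 12          # 13 and 14 of the previous year
        year -= 1
    K, J = year % 100, year // 100
    return (day + 13 * (month + 1) // 5 + K + K // 4 + J // 4 + 5 * J) % 7

print(zeller(2000, 1, 1))  # 0 -> Saturday
```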


Sat Dec 01, 2012 1:16 am

Joined: Thu Mar 31, 2011 4:00 pm
Posts: 10071
Post Re: Algorithms
Vincenty's formulae are two related iterative methods used in geodesy to calculate the distance between two points on the surface of a spheroid, developed by Thaddeus Vincenty (1975a). They are based on the assumption that the figure of the Earth is an oblate spheroid, and hence are more accurate than methods such as great-circle distance which assume a spherical Earth.

The first (direct) method computes the location of a point which is a given distance and azimuth (direction) from another point. The second (inverse) method computes the geographical distance and azimuth between two given points. They have been widely used in geodesy because they are accurate to within 0.5 mm (0.020″) on the Earth ellipsoid.

Link


This inverse formula seems to be more accurate than what my Magellan GPS displays. Judging by the published distance for 1 degree of latitude and longitude, the formula's calculations match that number.
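The inverse method can be sketched in Python with WGS-84 constants (a simplified version: it iterates on the longitude difference lambda until convergence, and may fail to converge for some nearly antipodal pairs):

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Distance in metres between two points on the WGS-84 ellipsoid
    via Vincenty's inverse method (degrees in, metres out)."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis, flattening
    b = (1 - f) * a
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                    # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        if cos2_alpha == 0:               # both points on the equator
            cos_2sm = 0.0
        else:
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# One degree of latitude along a meridian, from the equator: roughly 110.6 km
print(vincenty_inverse(0, 0, 1, 0))
```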


Sat Dec 01, 2012 5:25 pm

Joined: Mon Apr 11, 2011 6:55 pm
Posts: 6494
Post Re: Algorithms
Good brain food.


Sun Dec 02, 2012 10:17 pm

Joined: Mon Apr 11, 2011 6:55 pm
Posts: 6494
Post Re: Algorithms
The Reverse-Delete Algorithm is an algorithm in graph theory used to obtain a minimum spanning tree from a given connected, edge-weighted graph. If the graph is disconnected, this algorithm will find a minimum spanning tree for each disconnected part of the graph. The set of these minimum spanning trees is called a minimum spanning forest, which contains every vertex in the graph.

This is a greedy algorithm: it makes the locally best choice at each step. It is the reverse of Kruskal's algorithm, another greedy algorithm for finding a minimum spanning tree. Kruskal's algorithm starts with an empty graph and adds edges, while the Reverse-Delete algorithm starts with the original graph and deletes edges from it. The algorithm works as follows:

  • Start with graph G, which contains a list of edges E.
  • Go through E in decreasing order of edge weights.
  • For each edge, check if deleting the edge will further disconnect the graph.
  • Perform any deletion that does not lead to additional disconnection.
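The steps can be sketched in Python, with connectivity checked by a depth-first search after each tentative deletion (the graph below is a hypothetical toy example, assumed connected to start with):

```python
def reverse_delete_mst(vertices, edges):
    """Reverse-delete: visit edges in decreasing weight order and
    delete any edge whose removal does not disconnect the graph.
    Edges are (u, v, weight) tuples."""
    def connected(edge_list):
        # Depth-first search from an arbitrary vertex
        adj = {v: [] for v in vertices}
        for u, v, _ in edge_list:
            adj[u].append(v)
            adj[v].append(u)
        start = next(iter(vertices))
        seen, stack = {start}, [start]
        while stack:
            for w in adj[stack.pop()]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        return len(seen) == len(vertices)

    kept = sorted(edges, key=lambda e: e[2], reverse=True)
    for edge in list(kept):               # heaviest edge first
        trial = [e for e in kept if e != edge]
        if connected(trial):              # deletion keeps the graph connected
            kept = trial
    return kept

# Toy graph: a 4-cycle (weights 1..4) plus one heavy chord
edges = [(0, 1, 1), (1, 2, 2), (2, 3, 3), (3, 0, 4), (0, 2, 5)]
mst = reverse_delete_mst({0, 1, 2, 3}, edges)
print(sorted(mst), sum(w for _, _, w in mst))  # MST has total weight 6
```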


Sun Dec 09, 2012 11:30 am

Joined: Thu Mar 31, 2011 4:00 pm
Posts: 10071
Post Re: Algorithms
...
On the face of it, algorithms – “step-by-step procedures for calculations” – seem unlikely candidates for the role of tyrant. Their power comes from the fact that they are the key ingredients of all significant computer programs and the logic embedded in them determines what those programs do. In that sense algorithms are the secret sauce of a computerised world.

And they are secret. Every so often, the veil is lifted when there’s a scandal. Last August, for example, a “rogue algorithm” in the computers of a New York stockbroking firm, Knight Capital, embarked on 45 minutes of automated trading that eventually lost its owners $440m before it was stopped.

But, mostly, algorithms do their work quietly in the background.
...
PageRank thus gives Google awesome power. And, ever since Lord Acton’s time, we know what power does to people – and institutions. So the power of PageRank poses serious regulatory issues for governments. On the one hand, the algorithm is a closely guarded commercial secret – for obvious reasons: if it weren’t, then the search engine optimisers would have a field day and all search results would be suspect. On the other hand, because it’s secret, we can’t be sure that Google isn’t skewing results to favour its own commercial interests, as some people allege.

Besides, there’s more to power than commercial clout. Many years ago, the sociologist Steven Lukes pointed out that power comes in three varieties: the ability to stop people doing what they want to do; the ability to compel them to do things that they don’t want to do; and the ability to shape the way they think. This last is the power that mass media have, which is why the Leveson inquiry was so important.

But, in a way, algorithms also have that power. Take, for example, the one that drives Google News. This was recently subjected to an illuminating analysis by Nick Diakopoulos from the Nieman Journalism Lab. Google claims that its selection of noteworthy news stories is “generated entirely by computer algorithms without human editors. No humans were harmed or even used in the creation of this page.”

The implication is that the selection process is somehow more “objective” than a human-mediated one. Diakopoulos takes this cosy assumption apart by examining the way the algorithm works. There’s nothing sinister about it, but it highlights the importance of understanding how software works. The choice that faces citizens in a networked world is thus: program or be programmed.

Link


Mon Dec 17, 2012 12:00 am

Joined: Mon Apr 11, 2011 6:55 pm
Posts: 6494
Post Re: Algorithms
"Program or be Programmed"

Choose your programmers wisely.


Mon Dec 17, 2012 12:53 am

Joined: Thu Mar 31, 2011 4:00 pm
Posts: 10071
Post Re: Algorithms
Royal wrote:
"Program or be Programmed"

Choose your programmers wisely.


I choose bnonymous.


Mon Dec 17, 2012 1:41 am

Joined: Mon Apr 11, 2011 6:55 pm
Posts: 6494
Post Re: Algorithms
the people have spoken.


Mon Dec 17, 2012 1:59 am