Identifying irrelevant input variables in chaotic time series problems: Using the genetic algorithm for training neural networks


Randall S. Sexton
Ball State University, Department of Management, Muncie, Indiana
rssexton@mail.bsu.edu

Randall Sexton is an Assistant Professor at Ball State University. Dr. Sexton is supported in part by the George A. Ball Distinguished Research Fellowship program.

ABSTRACT

Many researchers consider a neural network to be a "black box" that maps the unknown relationships of inputs to corresponding outputs. By viewing neural networks in this manner, researchers often include many more input variables than are necessary for finding good solutions. This causes unneeded computation as well as impeding the search process by increasing the complexity of the network. The main reason for this practice is the dependence of the vast majority of neural network researchers on gradient techniques, typically a variation of backpropagation, for network optimization. Since gradient techniques are incapable of identifying unneeded weights in a solution, researchers have not been able to distinguish contributing inputs from those that are irrelevant. By using a global search technique, the genetic algorithm, for neural network optimization, it is possible to identify unneeded weights in the network model, which allows for identification of irrelevant input variables. This paper demonstrates, through an intensive Monte Carlo study, that the genetic algorithm can automatically reduce the dimensionality of neural network models during network optimization. The genetic algorithm is also directly compared with backpropagation networks to show its effectiveness at finding global versus local solutions.

KEY WORDS: Neural Networks, Genetic Algorithm, Backpropagation, Learning, Generalization, Optimization

1. INTRODUCTION

A major goal of NN research is to find the solution weights that not only work well for the training data (in-sample data), but also generalize well for interpolation data (out-of-sample). For any neural network (NN) model, as the number of inputs and hidden nodes increases, so does the complexity of solving the model. By correctly identifying the relevant variables and hidden nodes in a NN model, researchers can reduce the complexity and dimensionality of the problem, enhancing the probability of finding solutions that generalize to out-of-sample observations. Generalization refers to the ability of the NN to forecast estimates from patterns that have not been seen by the network. By decreasing the number of weights in the NN, the algorithm is forced to develop general rules to discriminate between the input patterns. The problem of producing NNs that generalize better has been widely studied (Burkitt [1991], Cottrell, Girard, Girard, & Mangeas [1993], Drucker & LeCun [1992], Fahlman & Lebiere [1990], Holmström & Koistinen [1992], Kamimura [1993], Karmin [1990], Kruschke [1989], Lendaris & Harls [1990], Romaniuk [1993]). It has been shown that generalization is better in smaller networks (Baum & Haussler [1989], Schiffman, Joos, & Werner [1993]).

Much of this past research attempts to produce a parsimonious NN solution through pruning techniques. Pruning techniques progressively reduce the size of a large network by eliminating weights in the NN. For gradient training algorithms, the weights that are eliminated fall into two categories: active and near-inactive. Near-inactive weights are those that have decayed close to zero. Removing these weights, since they are arbitrarily close to zero, will likely have little effect on the ability of the NN to generalize. If weights that are not close to zero, or active weights, are pruned, the NN will need considerable retraining (Lee [1997]). In this paper, the genetic algorithm (GA) is used simultaneously to reduce the errors between estimates and real output

values and the number of connections in the NN. By doing so, the GA can find an optimal NN architecture as well as identify irrelevant input variables in the NN model.

The majority of neural network researchers use a gradient technique, typically a variation of backpropagation (Rumelhart & McClelland [1986]), for neural network optimization. A problem that often occurs when using backpropagation (BP) is the loss of generalization power. There are two main reasons why this occurs: over-parameterization and local convergence. Over-parameterization is simply including more weights in the solution than are necessary to estimate the function. By including additional weights in the solution, the degrees of freedom are reduced. However, there is currently no known method for calculating the extent of the reduction. When over-parameterization occurs, the network tends to memorize the training data, which decreases the generalization ability of the network. Unneeded weights are added to the model by including irrelevant input variables or unneeded hidden nodes. Currently, no effective methods for identifying these unneeded weights with gradient techniques are known. Local convergence also contributes to a decrease in generalization ability. BP, by its very nature, converges to local solutions. Although BP could possibly converge upon a local solution that is global, this is unlikely because of the complex error surfaces generated when optimizing NNs. To improve the generalization ability of NNs, an appropriate global search method is needed that will identify unneeded weights in the solution, resulting in a parsimonious NN solution.

The genetic algorithm (GA) is proposed as an appropriate global search technique that is not limited to derivative-based objective functions for NN optimization. For a limited number of problems, the GA was shown to significantly outperform BP for NN training (Sexton, Dorsey, & Johnson [1998]). The purpose of this paper is to show the effectiveness of the GA in searching for an optimal parsimonious NN solution for a set of chaotic time series problems. Although the

GA has been used in past research for identification of relevant input variables and NN architecture for BP-trained networks, this research is unique in that it also uses the GA for training the NN. The benefits that result from this search algorithm include a reduction in network complexity, identification of irrelevant input variables, and increased generalization power of the NN solution.

The following section includes a general description of the genetic algorithm. Section 3 describes the Monte Carlo study. Section 4 provides the results of the comparison, followed by final remarks and conclusions in Section 5.

2. THE GENETIC ALGORITHM

Research combining genetic algorithms and neural networks began to appear in the mid to late 1980s. More than 250 references can be readily found in the literature today. A survey of that research can be found in Schaffer et al. [1992]. The two primary directions of this past research have been using the GA to improve the performance of BP by finding optimal neural network architectures and/or parameter settings, or using it as an alternative to BP for optimizing the network. This paper combines these past research directions, focusing on the use of the GA as an alternative to BP that can also find optimal NN architectures.

Most of the past research using the genetic algorithm for network optimization has found that the GA is not competitive with the best gradient learning methods. It has recently been shown, however (see Sexton, Dorsey and Johnson [1998]), that the problem with this research lies in the implementation of the GA and not in its inability to perform the task. For example, the majority of this research encodes each candidate solution of weights into binary strings. This approach works well for optimization of problems with only a few variables. For neural networks with large numbers of weights, this binary encoding results in extremely long strings.

As a result, the patterns that are essential to the GA's effectiveness are virtually impossible to maintain with the standard GA operators such as crossover and mutation. A more effective approach is to allow the GA to operate over real-valued parameters. Examples of this approach can be found in Montana & Davis [1989] and Sexton et al. [1998]. Although Montana & Davis successfully outperformed BP using the GA, their specific implementation of the genetic algorithm resulted in the crossover operator causing excessive loss of information about the schemata of the parameters. These schemata were influential in the prior generation's selection of the current generation's strings, and the loss of this information therefore reduces the effectiveness of the search process. The alternative approach described in the Sexton et al. [1998] paper also successfully outperformed BP on a variety of problems. This line of research is based on the algorithm developed by Dorsey & Mayer [1995].

As opposed to BP, the GA is a global search procedure that searches from one population of solutions to another, focusing on the area of the best solution so far while continuously sampling the total parameter space. The GA has recently been shown to perform exceptionally well at obtaining the global solution when optimizing difficult nonlinear functions (Dorsey & Mayer [1994], Dorsey & Mayer [1995]). An extension of this research has also shown the GA to perform well for optimizing the NN, another complex nonlinear function (Sexton, Dorsey, & Johnson [1998], Dorsey, Johnson, & Mayer [1994]). Unlike BP, which moves from one point to another based on gradient information, the GA simultaneously searches in many directions, which enhances the probability of finding the global optimum. Figure 1 illustrates a simple outline of the GA used in this study, while the terms used and parameter settings are briefly described in the following paragraphs. A formal description of the algorithm can be found in Dorsey & Mayer [1995].

Figure 1. Outline of the Genetic Algorithm

Initialization: Choose an initial population containing 20 solutions to be the current population. Each solution consists of a string of weights that are plugged into the NN. Compute the objective function value for each solution in the population.

Evaluation: Each member of the current population is evaluated by a fitness function based on its objective function value, to assign each solution a probability of being redrawn in the next generation.

Reproduction: A mating pool of 20 solutions is created by selecting solutions from the current population based on their assigned probabilities.

Crossover: The solutions in the mating pool are then randomly paired, constructing 10 sets of parent solutions. A point is randomly selected for each pair, at which the parent solutions switch the weights preceding that point, generating 20 new solutions for the next generation.

Mutation: In each generation, there is a small probability that any weight in the current population is replaced by a randomly drawn value from the entire weight space.

Mutation2: In each generation, there is a small probability that any weight in the current population is replaced by a hard zero.

Termination: The algorithm terminates after a user-specified number of generations.

Similar to BP, the GA randomly draws values in order to begin the search. However, unlike BP, which draws weights for only one solution, the GA draws weights for a population of solutions. The population size for this study is set to 20, which is user defined and is based on past research (Dorsey & Mayer [1995]). Once the population of solutions is drawn, the global search begins with this first generation. Each of the solutions in the population is then evaluated based on a preselected objective function, which is not necessarily differentiable. Since our objective is to find a global solution

that eliminates unneeded connections in the model, it is necessary to include an objective function that is not differentiable. The objective function chosen for this study is shown in Equation 1. The goal of this objective function is to find a NN solution that reduces the sum of squared errors as well as the number of non-zero weights.

(1)  E = Σ_{i=1}^{N} (O_i − Ô_i)² + C Σ_{i=1}^{N} ((O_i − Ô_i)/N)²

where N is the number of exemplars in the data set, O_i is the output value, Ô_i is the NN estimate, and C is the number of non-zero weights in the solution. Although this objective function seems to work well for the problems in this study, the penalty value assigned for non-zero connections is arbitrary. Additional research, beyond the scope of this study, is warranted for finding an optimal penalty assignment and is left for future work.

Once the solutions in the population are evaluated, a probability is assigned to each solution based on the value of the objective function. For example, using the chosen objective function, the solutions that result in the smallest error term are assigned the highest probabilities. This completes the first generation. The second generation is then randomly drawn based on the probabilities assigned in the former. For example, the best solutions (ones with the smallest errors and therefore the highest assigned probabilities) in the first generation are more likely to be drawn for the second generation of 20 solutions. This is known as reproduction, which parallels the process of natural selection or "survival of the fittest." The solutions that are most favorable in optimizing the objective function will reproduce and thrive in future generations, while poorer solutions die out.
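A sketch of how Equation 1 could be computed for one candidate solution; the exact scaling of the penalty term is an assumption of this sketch, and the function and variable names are illustrative:

```python
def penalized_error(outputs, estimates, weights):
    # Sum of squared errors between outputs O_i and NN estimates,
    # plus a penalty that grows with the number of non-zero weights C.
    # The penalty's scaling (C times the sum of squared mean-scaled
    # errors) is an assumption, not the paper's verbatim formula.
    n = len(outputs)
    sse = sum((o, e) == (o, e) and (o - e) ** 2 for o, e in zip(outputs, estimates))
    c = sum(1 for w in weights if w != 0.0)  # non-zero connections
    penalty = c * sum(((o - e) / n) ** 2 for o, e in zip(outputs, estimates))
    return sse + penalty

# two exemplars, three weights (one already pruned to a hard zero)
print(penalized_error([1.0, 0.0], [0.5, 0.5], [0.3, 0.0, -1.2]))  # 0.75
```

Because the penalty counts only non-zero weights, a solution that drives a weight to a hard zero lowers E even when its squared error is unchanged, which is what pressures the search toward parsimonious networks.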

Before the solutions in the second generation can be evaluated, two processes must take place: crossover and mutation. This new population, which only includes solutions that existed in the prior generation, is randomly grouped into pairs of solutions. For each pair of solutions, a random subset of weights is chosen and switched with its paired solution (crossover). For example, if a solution contains 10 weights, a random integer value is drawn from one to 10 for the first pair of solutions; for this example, let's say five is selected. Every weight above the 5th weight is now switched between the paired solutions, resulting in two new solutions for each pair. Once this is done for each pair of solutions, crossover is complete.

In order to sample the entire parameter space, and not be limited only to those randomly drawn values from the first generation, mutation must occur. Each solution in this new generation now has a small probability that any of its weights may be replaced with a value uniformly selected from the parameter space (mutation). If mutation occurs, the likelihood of this new weight surviving into the next generation is based on the probabilities assigned when the new solution is applied to the objective function. For example, if the solution now has a lower error value because of this new mutated weight, the solution containing this weight will have a higher probability of being drawn in the next generation, and likewise a lower probability of being drawn if the mutation causes the error to increase. To allow the GA to identify irrelevant weights in the solutions, an additional process (mutation2) was included: each solution in this new generation also has a small probability that any of its weights may be replaced with a hard zero. Once reproduction, crossover, mutation, and mutation2 have occurred, the new generation can be evaluated in order to determine the new probabilities for the next generation. This process continues until the initial population evolves to a generation that best solves the optimization problem, ideally the global solution.
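The full generation cycle described above can be sketched as follows. The population size of 20 and the single-point crossover follow the outline in Figure 1; the mutation probabilities, the weight bounds, and the stand-in objective function are assumptions of this sketch:

```python
import random

POP_SIZE = 20        # population size used in the study
N_WEIGHTS = 10       # illustrative solution length
BOUND = 10.0         # assumed bound on the weight space
P_MUT = 0.01         # assumed small probability of random-value mutation
P_MUT2 = 0.01        # assumed small probability of hard-zero mutation2

def objective(weights):
    # stand-in for the penalized error of Equation 1
    return sum(w * w for w in weights)

def next_generation(pop):
    # Evaluation: smaller error -> higher fitness -> higher draw probability
    fitness = [1.0 / (1e-9 + objective(s)) for s in pop]
    # Reproduction: draw a mating pool of POP_SIZE solutions
    pool = random.choices(pop, weights=fitness, k=POP_SIZE)
    # Crossover: pair solutions and swap weights after a random point
    nxt = []
    for a, b in zip(pool[0::2], pool[1::2]):
        cut = random.randrange(1, N_WEIGHTS)
        nxt.append(a[:cut] + b[cut:])
        nxt.append(b[:cut] + a[cut:])
    # Mutation: replace a weight with a uniform draw from the weight space
    # Mutation2: replace a weight with a hard zero
    for s in nxt:
        for i in range(N_WEIGHTS):
            if random.random() < P_MUT:
                s[i] = random.uniform(-BOUND, BOUND)
            if random.random() < P_MUT2:
                s[i] = 0.0
    return nxt

# Initialization: draw a random starting population
pop = [[random.uniform(-BOUND, BOUND) for _ in range(N_WEIGHTS)]
       for _ in range(POP_SIZE)]
# Termination: stop after a fixed number of generations
for _ in range(100):
    pop = next_generation(pop)
best = min(pop, key=objective)
```

Note that mutation2 is the only operator that creates hard zeros, so any input whose weights are all zero in the final best solution got there because zeroing those weights improved (or at least did not hurt) the penalized objective.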

3. MONTE CARLO DESCRIPTION

3.1. Relevance of the problems

Deterministic chaotic dynamical systems, such as the functions used in this study, are of great interest to researchers because of their comparability to the chaotic behavior of economic and financial data. Chaos as discussed in this research is a special type of system that is capable of exhibiting complex, often aperiodic, behavior in time. Recently there has been great interest in determining whether certain financial and economic time series are better described by linear stochastic models or are appropriately characterized by deterministic chaos. Empirical research has been hampered in detecting the presence of chaos in such time series due to the apparent randomness of this type of data. While the irregularity of such variables as GNP, employment, interest rates, and exchange rates has generally been attributed to random fluctuations, the ability of even simple deterministic chaotic models to produce complex time paths that appear to be random has attracted attention as a possible alternative explanation. Taking this into consideration, the significance of accurately estimating such chaotic behavior is apparent and is the reason this type of data is used in this study.

3.2. Chaotic time series problems

The following chaotic time series problems were included in this study. Problems 1-5 were taken from the chaos literature: problems 1-4 from Schuster [1995], and the 5th problem, a modified version of the Mackey-Glass equation, from Gallant and White [1992]. To test the GA's ability to determine irrelevant variables in larger problems, the Mackey-Glass equation (problem 5) was modified to incorporate additional independent input variables.
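Problem 1, for example, is the logistic map X_t = 4X_{t−1}(1 − X_{t−1}). A sketch of generating one 100-observation data set from it, with an appended irrelevant random input as in the data-generation procedure described below (the variable names are illustrative):

```python
import random

def make_dataset(n=100, seed=None):
    # One data set for problem 1: each row pairs the inputs
    # (the lagged value and an irrelevant uniform random draw)
    # with the target produced by the logistic map.
    rng = random.Random(seed)
    x = rng.random()                     # uniform random start for the series
    rows = []
    for _ in range(n):
        target = 4.0 * x * (1.0 - x)     # problem 1: logistic map
        irrelevant = rng.random()        # has no effect on the output
        rows.append(((x, irrelevant), target))
        x = target
    return rows

train = make_dataset(100, seed=1)
```

The irrelevant column is drawn independently of the series, so a training algorithm that can zero out connections should be able to disconnect it entirely.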

Although the functions are known, the NNs used to estimate these functions do not use this information but rely solely on the generated data for training purposes.

Chaotic time series problems

1) X_t = 4X_{t−1}(1 − X_{t−1})
2)-7) [equations illegible in source; problems 2-4 are maps from Schuster [1995], problems 5-6 involve Mackey-Glass-style lag structures, and problem 7 incorporates 12 lag variables]

3.3. Data generation

Forty data sets of 100 observations each were generated for each of the seven problems. Each data set was initialized with a random number drawn from a uniform distribution in order to begin the time series. To show the ability of the GA to identify irrelevant variables in the NN model, an additional irrelevant input variable was added to each of the data sets. The 40 data sets were split into two experiments. For half of the data sets (20 data sets) for each problem, the additional irrelevant variable consisted of a randomly drawn value from a uniform distribution. For the second half of the data sets (20 data sets) for each problem, the irrelevant input variable consisted of an additional time lag. In both cases the irrelevant input variables had no effect on the actual output variable. For each problem and irrelevant input variable type (random or lag), 10 data sets were used for training and 10 for testing.

Problems five and six are interesting because they already include irrelevant lag variables. For problems five and six the GA will have to identify four and 14 irrelevant

variables, respectively. Problem seven incorporates all 12 lag variables into the output; the only irrelevant variable included is the additional input variable added to the data sets.

3.4. Measures and comparisons

The main theme of this paper is to demonstrate the GA's ability to identify irrelevant variables in the NN model. This is done by first identifying unneeded weights in the NN model. Irrelevant variables are then determined by simply inspecting the solution: those inputs that have zeros for all connection weights are deemed irrelevant to the model. The performance measurement for the GA is the percentage of correctly identified irrelevant variables across all replications and problems. Although irrelevant variable identification is the major theme of this study, it is meaningless if the solutions found are inferior to those of more standard methods of NN optimization. For this reason, two variations of the BP algorithm are compared with the GA-trained networks, based on Root Mean Squared Error (RMSE) and CPU time in seconds.

3.5. Training with the genetic algorithm

The GA and corresponding parameters used for this study were set to values recommended by Dorsey and Mayer [1995]. The only parameter selected was the number of generations for training. For this study, 3,000 generations (60,000 epochs) was determined to be sufficient for finding solutions superior to those of BP. Although the network could have trained further with more generations, this was not necessary to demonstrate the effectiveness of a global search algorithm. In each case the GA had not converged but was stopped after the specified number of generations. If the GA were allowed to continue, it would converge arbitrarily

close to the global solution, but the additional search time was unnecessary for the comparison.

The number of hidden nodes included in the GA-optimized NNs was determined by an automatic pre-run of each data set. For each run, the NN architecture started with only one hidden node and trained for 100 generations. Once the training terminated, the best error was saved and one additional hidden node was added to the network. Training then commenced for 100 additional generations. This process continues until a network is found that generates an error larger than the previous one. Once this occurs, the NN architecture is set to the number of hidden nodes that generated the best error. Although the number of generations (100) used to determine the NN architecture is set arbitrarily, this value seems to work sufficiently well for these problems. Further research is needed to find an optimal epoch value for these runs.

3.6. Training with backpropagation

Two variations of BP were used as a baseline comparison with the GA results, in order to show the GA's ability to identify unneeded weights and their corresponding irrelevant variables, and to show its ability to find superior solutions. The first BP variation is the Cascade Correlation (Cascor) algorithm developed by Fahlman and Lebiere [1990]. This algorithm builds a NN topology while simultaneously identifying the appropriate connection weights. Although this algorithm attempts to determine the correct number of hidden nodes to include in the NN model, it still lacks the ability to identify specific weights that are not needed. The second variation of BP used is the standard BP algorithm with three different architectures. This set of runs included three, four, and five hidden nodes for training on all data sets. For both variations the learning rate (step value), momentum, and learning rate coefficient ratio were set to 1.0, 0.9, and 0.5, respectively. The learning rate coefficient ratio

reduced the learning rate and momentum term by 0.5 every epochs. This was done to help eliminate oscillations and to converge effectively upon a solution. An epoch is defined as one complete pass through the training set. Each of the four BP configurations trained for 250,000 epochs on each data set.

4. RESULTS

The GA trained on 20 different replications for each problem, which included 10 replications that had a random irrelevant variable and 10 replications that had an additional lag irrelevant variable, totaling 140 different NNs. Out of these 140 replications the GA correctly identified all, or 100%, of the irrelevant variables, including the three additional irrelevant variables in problem five and the 13 irrelevant variables for problem six. Figure 2 illustrates the connections of a GA-trained NN for one of problem 5's replications (irrelevant lag data set). As can be seen in this figure, the GA correctly identified the two lags that contributed to the output variable, while eliminating the connections to the irrelevant variables.

Figure 2. NN architecture for replication 1, problem 5 (output, hidden, and bias nodes shown)
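Inspecting a solution for irrelevant inputs, as in Figure 2, amounts to checking whether every connection weight leaving an input is zero. A minimal sketch, assuming the input-to-hidden weights are stored as one row per input:

```python
def irrelevant_inputs(input_to_hidden):
    # input_to_hidden[i][j] is the weight from input i to hidden node j.
    # An input is irrelevant when all of its outgoing weights are zero.
    return [i for i, row in enumerate(input_to_hidden)
            if all(w == 0.0 for w in row)]

weights = [
    [0.7, -1.2, 0.4],   # input 0: contributes to the output
    [0.0, 0.0, 0.0],    # input 1: all connections zeroed -> irrelevant
]
print(irrelevant_inputs(weights))  # [1]
```

Because mutation2 produces hard zeros rather than merely small weights, this test is an exact comparison, not a threshold check as it would have to be for decayed gradient-trained weights.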

To make this finding significant, a comparison was made to demonstrate the GA's superiority in finding global solutions. Tables 1 and 2 show the average RMSE on the test sets for the GA and the four configurations of BP, for replications containing random irrelevant input variables and lag irrelevant input variables, respectively.

Table 1 - Average RMSE for replications containing random irrelevant input variables, by problem, for the GA, BPC, BP3, BP4, and BP5 [values illegible in source]. BPC = Cascor algorithm; BP3, BP4, BP5 = 3, 4, and 5 hidden nodes.

Table 2 - Average RMSE for replications containing lag irrelevant input variables, by problem, for the GA, BPC, BP3, BP4, and BP5 [values illegible in source]. BPC = Cascor algorithm; BP3, BP4, BP5 = 3, 4, and 5 hidden nodes.
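Per-problem test errors like those in Tables 1 and 2 can be compared with the Wilcoxon matched-pairs signed-ranks procedure discussed next. A minimal hand-rolled version of the statistic, shown here as a sketch (it drops zero differences and ignores tied magnitudes, and it omits the p-value computation performed by the SPSS routine used in the study):

```python
def wilcoxon_statistic(errors_a, errors_b):
    # Wilcoxon matched-pairs signed-rank statistic W: rank the absolute
    # paired differences, then take the smaller of the positive-rank and
    # negative-rank sums. Zero differences are dropped; averaging of ranks
    # for tied magnitudes is omitted in this sketch.
    diffs = [a - b for a, b in zip(errors_a, errors_b) if a != b]
    ranked = sorted((abs(d), d) for d in diffs)
    pos = sum(r + 1 for r, (_, d) in enumerate(ranked) if d > 0)
    neg = sum(r + 1 for r, (_, d) in enumerate(ranked) if d < 0)
    return min(pos, neg)

ga = [0.10, 0.12, 0.08, 0.11]   # illustrative per-replication RMSEs
bp = [0.30, 0.25, 0.28, 0.31]
print(wilcoxon_statistic(ga, bp))  # 0: every GA error is smaller
```

A statistic of zero, as in this toy example where one method wins every pair, is the strongest possible one-sided outcome for the sample size.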

As can be seen in these tables, the GA generates superior solutions. Although it is apparent that the GA outperforms all four BP configurations based on RMSE, a statistical test is needed to show significant differences between these solutions. A statistical comparison of test results was conducted using the Wilcoxon matched-pairs signed-ranks (2-tailed P significance) test. This test is designed to test a hypothesis about the location of a population distribution. It does not require the assumption that the population be normally distributed and is used in place of the one-sample t-test when the normality assumption is questionable. More information on the Wilcoxon matched-pairs signed-ranks test can be found in Conover [1980]. The best estimates for both the GA and BP were used for the test, using the routine from the SPSS for Windows software package. The GA solutions dominated the BP solutions in every test set at the 99% level of significance.

Both the GA and the Cascor variation of BP attempted to find optimal NN architectures. Table 3 illustrates the average number of hidden nodes found for each problem across all replications. Since the architectures for the standard variation of BP were static, there was no need for additional comparisons. Although the Cascor variation of BP found similar optimal structures for problems 1-5 and seven, this algorithm failed to outperform the GA on these problems. For problem six, which included several more irrelevant input variables, the Cascor algorithm included a much larger number of hidden nodes than did the GA. Also, since Cascor is gradient based, this algorithm, unlike the GA, was incapable of identifying unneeded weights in the NN solutions.
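The hidden-node counts the GA settles on come from the automatic pre-run described earlier: add nodes one at a time, train briefly, and stop when the error worsens. A sketch, where train_for_generations is a hypothetical stand-in for 100 generations of GA training at a given hidden-layer size:

```python
def select_hidden_nodes(train_for_generations, max_nodes=20):
    # Grow the hidden layer one node at a time; keep the size that
    # produced the best error from a short (100-generation) training run,
    # stopping as soon as adding a node makes the error worse.
    best_error = float("inf")
    nodes = 0
    for h in range(1, max_nodes + 1):
        error = train_for_generations(h, generations=100)
        if error >= best_error:   # worse than the previous network: stop
            break
        best_error, nodes = error, h
    return nodes

# toy stand-in: error improves up to 4 hidden nodes, then worsens
errors = {1: 0.9, 2: 0.5, 3: 0.3, 4: 0.2, 5: 0.4}
print(select_hidden_nodes(lambda h, generations: errors[h]))  # 4
```

This is a greedy search: it stops at the first increase in error, which keeps the pre-run cheap but means a later, larger architecture that happened to train better would not be found.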

Table 3 - Average number of hidden nodes, by problem, for the GA and BPC, for random irrelevant variables and lag irrelevant variables [values illegible in source].

The final comparison between the GA and the four BP configurations was based on CPU time in seconds. All runs were made on a 200-MHz Pentium Pro workstation using the NT 4.0 operating system. Tables 4 and 5 illustrate the time differences for these runs. It can be seen from these tables that the GA not only finds global parsimonious solutions, but does so in an efficient manner. Although the GA was prematurely terminated, further training would only have resulted in better forecasts.

Table 4 - Average CPU time comparisons (random irrelevant input), by problem, for the GA, BPC, BP3, BP4, and BP5 [values illegible in source].

Table 5 - Average CPU time comparisons (lag irrelevant input), by problem, for the GA, BPC, BP3, BP4, and BP5 [values illegible in source].

5. CONCLUSIONS

Neural networks offer researchers a highly versatile tool for estimation. Unfortunately, past research has limited this tool by relying on gradient techniques for optimization. By using an appropriate global search technique, like the genetic algorithm, many of the limitations of gradient techniques can be eliminated. It has been shown in this intensive Monte Carlo study that the GA was able to effectively identify 100% of the irrelevant input variables for these chaotic time series problems. The significance of irrelevant input variable identification lies in the additional information it gives to researchers, as well as the improvement in generalization of neural network models. It was also shown that, compared with BP, the GA was able to find significantly superior solutions in an efficient manner. Hopefully, these results will generate interest in future neural network research that builds upon the findings of this study.

REFERENCES

Baum, E. B., & Haussler, D. [1989]. What size net gives valid generalization? Neural Computation, 1.

Burkitt, A. N. [1991]. Optimisation of the architecture of feed-forward neural nets with hidden layers by unit elimination. Complex Systems, 5.

Cottrell, M., Girard, B., Girard, Y., & Mangeas, M. [1993]. Time series and neural network: A statistical method for weight elimination. In M. Verleysen (Ed.), European Symposium on Artificial Neural Networks. Brussels: D facto.

Conover, W. J. [1980]. Practical nonparametric statistics, 2nd ed. New York: John Wiley & Sons.

Dorsey, R. E., Johnson, J. D., & Mayer, W. J. [1994]. "A genetic algorithm for the training of feedforward neural networks," Advances in Artificial Intelligence in Economics, Finance, and Management (J. D. Johnson and A. B. Whinston, eds.), Vol. 1. JAI Press Inc., Greenwich, CT.

Dorsey, R. E., & Mayer, W. J. [1995]. "Genetic algorithms for estimation problems with multiple optima, non-differentiability, and other irregular features," Journal of Business and Economic Statistics, 13(1).

Dorsey, R. E., & Mayer, W. J. [1994]. "Optimization using genetic algorithms," Advances in Artificial Intelligence in Economics, Finance, and Management (J. D. Johnson and A. B. Whinston, eds.), Vol. 1. JAI Press Inc., Greenwich, CT.

Drucker, H., & LeCun, Y. [1992]. Improving generalisation performance using double backpropagation. IEEE Transactions on Neural Networks, 3.

Fahlman, S. E., & Lebiere, C. [1990]. "The cascade-correlation learning architecture," Advances in Neural Information Processing Systems, Vol. II. Morgan Kaufmann, San Mateo, CA.

Gallant, A. R., & White, H. [1992]. "On learning the derivatives of an unknown mapping with multilayer feedforward networks," Artificial neural networks: Approximation and learning theory (H. White, ed.). Blackwell Publishers, Cambridge, MA.

Holmström, L., & Koistinen, P. [1992]. Using additive noise in backpropagation training. IEEE Transactions on Neural Networks, 3.

Kamimura, R. [1993]. Internal representation with minimum entropy in recurrent neural networks: Minimizing entropy through inhibitory connections. Network: Computation in Neural Systems, 4.

Karmin, E. D. [1990]. A simple procedure for pruning backpropagation trained networks. IEEE Transactions on Neural Networks, 1.

Kruschke, J. K. [1989]. Distributed bottlenecks for improved generalization in backpropagation networks. International Journal of Neural Networks Research and Applications, 1.

Lee, C. W. [1997]. Training feedforward neural networks: An algorithm giving improved generalization. Neural Networks, 10(1).

Lendaris, G. G., & Harls, I. A. [1990]. Improved generalization in ANNs via use of conceptual graphs: A character recognition task as an example case. Proceedings IJCNN-90. Piscataway, NJ: IEEE.

Montana, D. J., & Davis, L. [1989]. "Training feedforward neural networks using genetic algorithms," Proceedings of the Third International Conference on Genetic Algorithms. Morgan Kaufmann, San Mateo, CA.

Romaniuk, S. G. [1993]. Pruning divide and conquer networks. Network: Computation in Neural Systems, 4.

Rumelhart, D. E., & McClelland, J. L. (Eds.) [1986]. Parallel distributed processing, Vol. 1. MIT Press, Cambridge, MA.

Schaffer, J. D., Whitley, D., & Eshelman, L. J. [1992]. "Combinations of genetic algorithms and neural networks: A survey of the state of the art," COGANN-92 Combinations of Genetic Algorithms and Neural Networks. IEEE Computer Society Press, Los Alamitos, CA.

Schiffman, W., Joos, M., & Werner, R. [1993]. Comparison of optimized backpropagation algorithms. In M. Verleysen (Ed.), European Symposium on Artificial Neural Networks. Brussels: D facto.

Schuster, H. [1995]. Deterministic chaos: An introduction. VCH, Weinheim.

Sexton, R. S., Dorsey, R. E., & Johnson, J. D. [1998]. "Toward a global optimum for neural networks: A comparison of the genetic algorithm and backpropagation," Decision Support Systems, 22(2).


More information

Softprop: Softmax Neural Network Backpropagation Learning

Softprop: Softmax Neural Network Backpropagation Learning Softprop: Softmax Neural Networ Bacpropagation Learning Michael Rimer Computer Science Department Brigham Young University Provo, UT 84602, USA E-mail: mrimer@axon.cs.byu.edu Tony Martinez Computer Science

More information

Knowledge Transfer in Deep Convolutional Neural Nets

Knowledge Transfer in Deep Convolutional Neural Nets Knowledge Transfer in Deep Convolutional Neural Nets Steven Gutstein, Olac Fuentes and Eric Freudenthal Computer Science Department University of Texas at El Paso El Paso, Texas, 79968, U.S.A. Abstract

More information

Introduction to Ensemble Learning Featuring Successes in the Netflix Prize Competition

Introduction to Ensemble Learning Featuring Successes in the Netflix Prize Competition Introduction to Ensemble Learning Featuring Successes in the Netflix Prize Competition Todd Holloway Two Lecture Series for B551 November 20 & 27, 2007 Indiana University Outline Introduction Bias and

More information

A Neural Network GUI Tested on Text-To-Phoneme Mapping

A Neural Network GUI Tested on Text-To-Phoneme Mapping A Neural Network GUI Tested on Text-To-Phoneme Mapping MAARTEN TROMPPER Universiteit Utrecht m.f.a.trompper@students.uu.nl Abstract Text-to-phoneme (T2P) mapping is a necessary step in any speech synthesis

More information

have to be modeled) or isolated words. Output of the system is a grapheme-tophoneme conversion system which takes as its input the spelling of words,

have to be modeled) or isolated words. Output of the system is a grapheme-tophoneme conversion system which takes as its input the spelling of words, A Language-Independent, Data-Oriented Architecture for Grapheme-to-Phoneme Conversion Walter Daelemans and Antal van den Bosch Proceedings ESCA-IEEE speech synthesis conference, New York, September 1994

More information

Machine Learning and Data Mining. Ensembles of Learners. Prof. Alexander Ihler

Machine Learning and Data Mining. Ensembles of Learners. Prof. Alexander Ihler Machine Learning and Data Mining Ensembles of Learners Prof. Alexander Ihler Ensemble methods Why learn one classifier when you can learn many? Ensemble: combine many predictors (Weighted) combina

More information

Does the Difficulty of an Interruption Affect our Ability to Resume?

Does the Difficulty of an Interruption Affect our Ability to Resume? Difficulty of Interruptions 1 Does the Difficulty of an Interruption Affect our Ability to Resume? David M. Cades Deborah A. Boehm Davis J. Gregory Trafton Naval Research Laboratory Christopher A. Monk

More information

Diagnostic Test. Middle School Mathematics

Diagnostic Test. Middle School Mathematics Diagnostic Test Middle School Mathematics Copyright 2010 XAMonline, Inc. All rights reserved. No part of the material protected by this copyright notice may be reproduced or utilized in any form or by

More information

Evolution of Symbolisation in Chimpanzees and Neural Nets

Evolution of Symbolisation in Chimpanzees and Neural Nets Evolution of Symbolisation in Chimpanzees and Neural Nets Angelo Cangelosi Centre for Neural and Adaptive Systems University of Plymouth (UK) a.cangelosi@plymouth.ac.uk Introduction Animal communication

More information

ACTIVITY: Comparing Combination Locks

ACTIVITY: Comparing Combination Locks 5.4 Compound Events outcomes of one or more events? ow can you find the number of possible ACIVIY: Comparing Combination Locks Work with a partner. You are buying a combination lock. You have three choices.

More information

Rule Learning with Negation: Issues Regarding Effectiveness

Rule Learning with Negation: Issues Regarding Effectiveness Rule Learning with Negation: Issues Regarding Effectiveness Stephanie Chua, Frans Coenen, and Grant Malcolm University of Liverpool Department of Computer Science, Ashton Building, Ashton Street, L69 3BX

More information

Rule Learning With Negation: Issues Regarding Effectiveness

Rule Learning With Negation: Issues Regarding Effectiveness Rule Learning With Negation: Issues Regarding Effectiveness S. Chua, F. Coenen, G. Malcolm University of Liverpool Department of Computer Science, Ashton Building, Ashton Street, L69 3BX Liverpool, United

More information

The Good Judgment Project: A large scale test of different methods of combining expert predictions

The Good Judgment Project: A large scale test of different methods of combining expert predictions The Good Judgment Project: A large scale test of different methods of combining expert predictions Lyle Ungar, Barb Mellors, Jon Baron, Phil Tetlock, Jaime Ramos, Sam Swift The University of Pennsylvania

More information

Lahore University of Management Sciences. FINN 321 Econometrics Fall Semester 2017

Lahore University of Management Sciences. FINN 321 Econometrics Fall Semester 2017 Instructor Syed Zahid Ali Room No. 247 Economics Wing First Floor Office Hours Email szahid@lums.edu.pk Telephone Ext. 8074 Secretary/TA TA Office Hours Course URL (if any) Suraj.lums.edu.pk FINN 321 Econometrics

More information

QuickStroke: An Incremental On-line Chinese Handwriting Recognition System

QuickStroke: An Incremental On-line Chinese Handwriting Recognition System QuickStroke: An Incremental On-line Chinese Handwriting Recognition System Nada P. Matić John C. Platt Λ Tony Wang y Synaptics, Inc. 2381 Bering Drive San Jose, CA 95131, USA Abstract This paper presents

More information

(Sub)Gradient Descent

(Sub)Gradient Descent (Sub)Gradient Descent CMSC 422 MARINE CARPUAT marine@cs.umd.edu Figures credit: Piyush Rai Logistics Midterm is on Thursday 3/24 during class time closed book/internet/etc, one page of notes. will include

More information

A Process-Model Account of Task Interruption and Resumption: When Does Encoding of the Problem State Occur?

A Process-Model Account of Task Interruption and Resumption: When Does Encoding of the Problem State Occur? A Process-Model Account of Task Interruption and Resumption: When Does Encoding of the Problem State Occur? Dario D. Salvucci Drexel University Philadelphia, PA Christopher A. Monk George Mason University

More information

Henry Tirri* Petri Myllymgki

Henry Tirri* Petri Myllymgki From: AAAI Technical Report SS-93-04. Compilation copyright 1993, AAAI (www.aaai.org). All rights reserved. Bayesian Case-Based Reasoning with Neural Networks Petri Myllymgki Henry Tirri* email: University

More information

SARDNET: A Self-Organizing Feature Map for Sequences

SARDNET: A Self-Organizing Feature Map for Sequences SARDNET: A Self-Organizing Feature Map for Sequences Daniel L. James and Risto Miikkulainen Department of Computer Sciences The University of Texas at Austin Austin, TX 78712 dljames,risto~cs.utexas.edu

More information

School Size and the Quality of Teaching and Learning

School Size and the Quality of Teaching and Learning School Size and the Quality of Teaching and Learning An Analysis of Relationships between School Size and Assessments of Factors Related to the Quality of Teaching and Learning in Primary Schools Undertaken

More information

Visual CP Representation of Knowledge

Visual CP Representation of Knowledge Visual CP Representation of Knowledge Heather D. Pfeiffer and Roger T. Hartley Department of Computer Science New Mexico State University Las Cruces, NM 88003-8001, USA email: hdp@cs.nmsu.edu and rth@cs.nmsu.edu

More information

Evolutive Neural Net Fuzzy Filtering: Basic Description

Evolutive Neural Net Fuzzy Filtering: Basic Description Journal of Intelligent Learning Systems and Applications, 2010, 2: 12-18 doi:10.4236/jilsa.2010.21002 Published Online February 2010 (http://www.scirp.org/journal/jilsa) Evolutive Neural Net Fuzzy Filtering:

More information

Problem-Solving with Toothpicks, Dots, and Coins Agenda (Target duration: 50 min.)

Problem-Solving with Toothpicks, Dots, and Coins Agenda (Target duration: 50 min.) STRUCTURED EXPERIENCE: ROLE PLAY Problem-Solving with Toothpicks, Dots, and Coins Agenda (Target duration: 50 min.) [Note: Preparation of materials should occur well before the group interview begins,

More information

The 9 th International Scientific Conference elearning and software for Education Bucharest, April 25-26, / X

The 9 th International Scientific Conference elearning and software for Education Bucharest, April 25-26, / X The 9 th International Scientific Conference elearning and software for Education Bucharest, April 25-26, 2013 10.12753/2066-026X-13-154 DATA MINING SOLUTIONS FOR DETERMINING STUDENT'S PROFILE Adela BÂRA,

More information

Classification Using ANN: A Review

Classification Using ANN: A Review International Journal of Computational Intelligence Research ISSN 0973-1873 Volume 13, Number 7 (2017), pp. 1811-1820 Research India Publications http://www.ripublication.com Classification Using ANN:

More information

Age Effects on Syntactic Control in. Second Language Learning

Age Effects on Syntactic Control in. Second Language Learning Age Effects on Syntactic Control in Second Language Learning Miriam Tullgren Loyola University Chicago Abstract 1 This paper explores the effects of age on second language acquisition in adolescents, ages

More information

Generative models and adversarial training

Generative models and adversarial training Day 4 Lecture 1 Generative models and adversarial training Kevin McGuinness kevin.mcguinness@dcu.ie Research Fellow Insight Centre for Data Analytics Dublin City University What is a generative model?

More information

Modeling function word errors in DNN-HMM based LVCSR systems

Modeling function word errors in DNN-HMM based LVCSR systems Modeling function word errors in DNN-HMM based LVCSR systems Melvin Jose Johnson Premkumar, Ankur Bapna and Sree Avinash Parchuri Department of Computer Science Department of Electrical Engineering Stanford

More information

AUTOMATIC DETECTION OF PROLONGED FRICATIVE PHONEMES WITH THE HIDDEN MARKOV MODELS APPROACH 1. INTRODUCTION

AUTOMATIC DETECTION OF PROLONGED FRICATIVE PHONEMES WITH THE HIDDEN MARKOV MODELS APPROACH 1. INTRODUCTION JOURNAL OF MEDICAL INFORMATICS & TECHNOLOGIES Vol. 11/2007, ISSN 1642-6037 Marek WIŚNIEWSKI *, Wiesława KUNISZYK-JÓŹKOWIAK *, Elżbieta SMOŁKA *, Waldemar SUSZYŃSKI * HMM, recognition, speech, disorders

More information

Reinforcement Learning by Comparing Immediate Reward

Reinforcement Learning by Comparing Immediate Reward Reinforcement Learning by Comparing Immediate Reward Punit Pandey DeepshikhaPandey Dr. Shishir Kumar Abstract This paper introduces an approach to Reinforcement Learning Algorithm by comparing their immediate

More information

Python Machine Learning

Python Machine Learning Python Machine Learning Unlock deeper insights into machine learning with this vital guide to cuttingedge predictive analytics Sebastian Raschka [ PUBLISHING 1 open source I community experience distilled

More information

Using the Attribute Hierarchy Method to Make Diagnostic Inferences about Examinees Cognitive Skills in Algebra on the SAT

Using the Attribute Hierarchy Method to Make Diagnostic Inferences about Examinees Cognitive Skills in Algebra on the SAT The Journal of Technology, Learning, and Assessment Volume 6, Number 6 February 2008 Using the Attribute Hierarchy Method to Make Diagnostic Inferences about Examinees Cognitive Skills in Algebra on the

More information

FY year and 3-year Cohort Default Rates by State and Level and Control of Institution

FY year and 3-year Cohort Default Rates by State and Level and Control of Institution Student Aid Policy Analysis FY2007 2-year and 3-year Cohort Default Rates by State and Level and Control of Institution Mark Kantrowitz Publisher of FinAid.org and FastWeb.com January 5, 2010 EXECUTIVE

More information

The Evolution of Random Phenomena

The Evolution of Random Phenomena The Evolution of Random Phenomena A Look at Markov Chains Glen Wang glenw@uchicago.edu Splash! Chicago: Winter Cascade 2012 Lecture 1: What is Randomness? What is randomness? Can you think of some examples

More information

Model Ensemble for Click Prediction in Bing Search Ads

Model Ensemble for Click Prediction in Bing Search Ads Model Ensemble for Click Prediction in Bing Search Ads Xiaoliang Ling Microsoft Bing xiaoling@microsoft.com Hucheng Zhou Microsoft Research huzho@microsoft.com Weiwei Deng Microsoft Bing dedeng@microsoft.com

More information

Conceptual and Procedural Knowledge of a Mathematics Problem: Their Measurement and Their Causal Interrelations

Conceptual and Procedural Knowledge of a Mathematics Problem: Their Measurement and Their Causal Interrelations Conceptual and Procedural Knowledge of a Mathematics Problem: Their Measurement and Their Causal Interrelations Michael Schneider (mschneider@mpib-berlin.mpg.de) Elsbeth Stern (stern@mpib-berlin.mpg.de)

More information

Improved Effects of Word-Retrieval Treatments Subsequent to Addition of the Orthographic Form

Improved Effects of Word-Retrieval Treatments Subsequent to Addition of the Orthographic Form Orthographic Form 1 Improved Effects of Word-Retrieval Treatments Subsequent to Addition of the Orthographic Form The development and testing of word-retrieval treatments for aphasia has generally focused

More information

COMPUTATIONAL COMPLEXITY OF LEFT-ASSOCIATIVE GRAMMAR

COMPUTATIONAL COMPLEXITY OF LEFT-ASSOCIATIVE GRAMMAR COMPUTATIONAL COMPLEXITY OF LEFT-ASSOCIATIVE GRAMMAR ROLAND HAUSSER Institut für Deutsche Philologie Ludwig-Maximilians Universität München München, West Germany 1. CHOICE OF A PRIMITIVE OPERATION The

More information

Class-Discriminative Weighted Distortion Measure for VQ-Based Speaker Identification

Class-Discriminative Weighted Distortion Measure for VQ-Based Speaker Identification Class-Discriminative Weighted Distortion Measure for VQ-Based Speaker Identification Tomi Kinnunen and Ismo Kärkkäinen University of Joensuu, Department of Computer Science, P.O. Box 111, 80101 JOENSUU,

More information

INPE São José dos Campos

INPE São José dos Campos INPE-5479 PRE/1778 MONLINEAR ASPECTS OF DATA INTEGRATION FOR LAND COVER CLASSIFICATION IN A NEDRAL NETWORK ENVIRONNENT Maria Suelena S. Barros Valter Rodrigues INPE São José dos Campos 1993 SECRETARIA

More information

Massachusetts Institute of Technology Tel: Massachusetts Avenue Room 32-D558 MA 02139

Massachusetts Institute of Technology Tel: Massachusetts Avenue  Room 32-D558 MA 02139 Hariharan Narayanan Massachusetts Institute of Technology Tel: 773.428.3115 LIDS har@mit.edu 77 Massachusetts Avenue http://www.mit.edu/~har Room 32-D558 MA 02139 EMPLOYMENT Massachusetts Institute of

More information

Artificial Neural Networks

Artificial Neural Networks Artificial Neural Networks Andres Chavez Math 382/L T/Th 2:00-3:40 April 13, 2010 Chavez2 Abstract The main interest of this paper is Artificial Neural Networks (ANNs). A brief history of the development

More information

Unsupervised Learning of Word Semantic Embedding using the Deep Structured Semantic Model

Unsupervised Learning of Word Semantic Embedding using the Deep Structured Semantic Model Unsupervised Learning of Word Semantic Embedding using the Deep Structured Semantic Model Xinying Song, Xiaodong He, Jianfeng Gao, Li Deng Microsoft Research, One Microsoft Way, Redmond, WA 98052, U.S.A.

More information

Lecture 10: Reinforcement Learning

Lecture 10: Reinforcement Learning Lecture 1: Reinforcement Learning Cognitive Systems II - Machine Learning SS 25 Part III: Learning Programs and Strategies Q Learning, Dynamic Programming Lecture 1: Reinforcement Learning p. Motivation

More information

When!Identifying!Contributors!is!Costly:!An! Experiment!on!Public!Goods!

When!Identifying!Contributors!is!Costly:!An! Experiment!on!Public!Goods! !! EVIDENCE-BASED RESEARCH ON CHARITABLE GIVING SPI$FUNDED$ When!Identifying!Contributors!is!Costly:!An! Experiment!on!Public!Goods! Anya!Samek,!Roman!M.!Sheremeta!! University!of!WisconsinFMadison! Case!Western!Reserve!University!&!Chapman!University!!

More information

medicaid and the How will the Medicaid Expansion for Adults Impact Eligibility and Coverage? Key Findings in Brief

medicaid and the How will the Medicaid Expansion for Adults Impact Eligibility and Coverage? Key Findings in Brief on medicaid and the uninsured July 2012 How will the Medicaid Expansion for Impact Eligibility and Coverage? Key Findings in Brief Effective January 2014, the ACA establishes a new minimum Medicaid eligibility

More information

Human Emotion Recognition From Speech

Human Emotion Recognition From Speech RESEARCH ARTICLE OPEN ACCESS Human Emotion Recognition From Speech Miss. Aparna P. Wanare*, Prof. Shankar N. Dandare *(Department of Electronics & Telecommunication Engineering, Sant Gadge Baba Amravati

More information

stateorvalue to each variable in a given set. We use p(x = xjy = y) (or p(xjy) as a shorthand) to denote the probability that X = x given Y = y. We al

stateorvalue to each variable in a given set. We use p(x = xjy = y) (or p(xjy) as a shorthand) to denote the probability that X = x given Y = y. We al Dependency Networks for Collaborative Filtering and Data Visualization David Heckerman, David Maxwell Chickering, Christopher Meek, Robert Rounthwaite, Carl Kadie Microsoft Research Redmond WA 98052-6399

More information

Analysis of Hybrid Soft and Hard Computing Techniques for Forex Monitoring Systems

Analysis of Hybrid Soft and Hard Computing Techniques for Forex Monitoring Systems Analysis of Hybrid Soft and Hard Computing Techniques for Forex Monitoring Systems Ajith Abraham School of Business Systems, Monash University, Clayton, Victoria 3800, Australia. Email: ajith.abraham@ieee.org

More information

ACTL5103 Stochastic Modelling For Actuaries. Course Outline Semester 2, 2014

ACTL5103 Stochastic Modelling For Actuaries. Course Outline Semester 2, 2014 UNSW Australia Business School School of Risk and Actuarial Studies ACTL5103 Stochastic Modelling For Actuaries Course Outline Semester 2, 2014 Part A: Course-Specific Information Please consult Part B

More information

Course Outline. Course Grading. Where to go for help. Academic Integrity. EE-589 Introduction to Neural Networks NN 1 EE

Course Outline. Course Grading. Where to go for help. Academic Integrity. EE-589 Introduction to Neural Networks NN 1 EE EE-589 Introduction to Neural Assistant Prof. Dr. Turgay IBRIKCI Room # 305 (322) 338 6868 / 139 Wensdays 9:00-12:00 Course Outline The course is divided in two parts: theory and practice. 1. Theory covers

More information

Test Effort Estimation Using Neural Network

Test Effort Estimation Using Neural Network J. Software Engineering & Applications, 2010, 3: 331-340 doi:10.4236/jsea.2010.34038 Published Online April 2010 (http://www.scirp.org/journal/jsea) 331 Chintala Abhishek*, Veginati Pavan Kumar, Harish

More information

Mathematics 112 Phone: (580) Southeastern Oklahoma State University Web: Durant, OK USA

Mathematics 112 Phone: (580) Southeastern Oklahoma State University Web:  Durant, OK USA Karl H. Frinkle Contact Information Research Interests Education Mathematics 112 Phone: (580) 745-2028 Department of Mathematics E-mail: kfrinkle@se.edu Southeastern Oklahoma State University Web: http://homepages.se.edu/kfrinkle/

More information

Dublin City Schools Mathematics Graded Course of Study GRADE 4

Dublin City Schools Mathematics Graded Course of Study GRADE 4 I. Content Standard: Number, Number Sense and Operations Standard Students demonstrate number sense, including an understanding of number systems and reasonable estimates using paper and pencil, technology-supported

More information

A cautionary note is research still caught up in an implementer approach to the teacher?

A cautionary note is research still caught up in an implementer approach to the teacher? A cautionary note is research still caught up in an implementer approach to the teacher? Jeppe Skott Växjö University, Sweden & the University of Aarhus, Denmark Abstract: In this paper I outline two historically

More information

On-Line Data Analytics

On-Line Data Analytics International Journal of Computer Applications in Engineering Sciences [VOL I, ISSUE III, SEPTEMBER 2011] [ISSN: 2231-4946] On-Line Data Analytics Yugandhar Vemulapalli #, Devarapalli Raghu *, Raja Jacob

More information

On the Combined Behavior of Autonomous Resource Management Agents

On the Combined Behavior of Autonomous Resource Management Agents On the Combined Behavior of Autonomous Resource Management Agents Siri Fagernes 1 and Alva L. Couch 2 1 Faculty of Engineering Oslo University College Oslo, Norway siri.fagernes@iu.hio.no 2 Computer Science

More information

An empirical study of learning speed in backpropagation

An empirical study of learning speed in backpropagation Carnegie Mellon University Research Showcase @ CMU Computer Science Department School of Computer Science 1988 An empirical study of learning speed in backpropagation networks Scott E. Fahlman Carnegie

More information

Psychometric Research Brief Office of Shared Accountability

Psychometric Research Brief Office of Shared Accountability August 2012 Psychometric Research Brief Office of Shared Accountability Linking Measures of Academic Progress in Mathematics and Maryland School Assessment in Mathematics Huafang Zhao, Ph.D. This brief

More information

Instructor: Mario D. Garrett, Ph.D. Phone: Office: Hepner Hall (HH) 100

Instructor: Mario D. Garrett, Ph.D.   Phone: Office: Hepner Hall (HH) 100 San Diego State University School of Social Work 610 COMPUTER APPLICATIONS FOR SOCIAL WORK PRACTICE Statistical Package for the Social Sciences Office: Hepner Hall (HH) 100 Instructor: Mario D. Garrett,

More information

CS Machine Learning

CS Machine Learning CS 478 - Machine Learning Projects Data Representation Basic testing and evaluation schemes CS 478 Data and Testing 1 Programming Issues l Program in any platform you want l Realize that you will be doing

More information

Keith Weigelt. University of Pennsylvania The Wharton School Management Department 2022 Steinberg-Dietrich Hall Philadelphia, PA (215)

Keith Weigelt. University of Pennsylvania The Wharton School Management Department 2022 Steinberg-Dietrich Hall Philadelphia, PA (215) Keith Weigelt University of Pennsylvania The Wharton School Management Department 2022 Steinberg-Dietrich Hall Philadelphia, PA 19104 (215) 898-6369 I. EDUCATIONAL BACKGROUND 1986 Ph.D. in Business Policy,

More information

Module 12. Machine Learning. Version 2 CSE IIT, Kharagpur

Module 12. Machine Learning. Version 2 CSE IIT, Kharagpur Module 12 Machine Learning 12.1 Instructional Objective The students should understand the concept of learning systems Students should learn about different aspects of a learning system Students should

More information

A study of speaker adaptation for DNN-based speech synthesis

A study of speaker adaptation for DNN-based speech synthesis A study of speaker adaptation for DNN-based speech synthesis Zhizheng Wu, Pawel Swietojanski, Christophe Veaux, Steve Renals, Simon King The Centre for Speech Technology Research (CSTR) University of Edinburgh,

More information

Learning Optimal Dialogue Strategies: A Case Study of a Spoken Dialogue Agent for

Learning Optimal Dialogue Strategies: A Case Study of a Spoken Dialogue Agent for Learning Optimal Dialogue Strategies: A Case Study of a Spoken Dialogue Agent for Email Marilyn A. Walker Jeanne C. Fromer Shrikanth Narayanan walker@research.att.com jeannie@ai.mit.edu shri@research.att.com

More information

Malicious User Suppression for Cooperative Spectrum Sensing in Cognitive Radio Networks using Dixon s Outlier Detection Method

Malicious User Suppression for Cooperative Spectrum Sensing in Cognitive Radio Networks using Dixon s Outlier Detection Method Malicious User Suppression for Cooperative Spectrum Sensing in Cognitive Radio Networks using Dixon s Outlier Detection Method Sanket S. Kalamkar and Adrish Banerjee Department of Electrical Engineering

More information

Learning Structural Correspondences Across Different Linguistic Domains with Synchronous Neural Language Models

Learning Structural Correspondences Across Different Linguistic Domains with Synchronous Neural Language Models Learning Structural Correspondences Across Different Linguistic Domains with Synchronous Neural Language Models Stephan Gouws and GJ van Rooyen MIH Medialab, Stellenbosch University SOUTH AFRICA {stephan,gvrooyen}@ml.sun.ac.za

More information

Deploying Agile Practices in Organizations: A Case Study

Deploying Agile Practices in Organizations: A Case Study Copyright: EuroSPI 2005, Will be presented at 9-11 November, Budapest, Hungary Deploying Agile Practices in Organizations: A Case Study Minna Pikkarainen 1, Outi Salo 1, and Jari Still 2 1 VTT Technical

More information

FUZZY EXPERT. Dr. Kasim M. Al-Aubidy. Philadelphia University. Computer Eng. Dept February 2002 University of Damascus-Syria

FUZZY EXPERT. Dr. Kasim M. Al-Aubidy. Philadelphia University. Computer Eng. Dept February 2002 University of Damascus-Syria FUZZY EXPERT SYSTEMS 16-18 18 February 2002 University of Damascus-Syria Dr. Kasim M. Al-Aubidy Computer Eng. Dept. Philadelphia University What is Expert Systems? ES are computer programs that emulate

More information

Strategies for Solving Fraction Tasks and Their Link to Algebraic Thinking

Strategies for Solving Fraction Tasks and Their Link to Algebraic Thinking Strategies for Solving Fraction Tasks and Their Link to Algebraic Thinking Catherine Pearn The University of Melbourne Max Stephens The University of Melbourne

More information

Essentials of Ability Testing. Joni Lakin Assistant Professor Educational Foundations, Leadership, and Technology

Essentials of Ability Testing. Joni Lakin Assistant Professor Educational Foundations, Leadership, and Technology Essentials of Ability Testing Joni Lakin Assistant Professor Educational Foundations, Leadership, and Technology Basic Topics Why do we administer ability tests? What do ability tests measure? How are

More information

Data Fusion Through Statistical Matching

Data Fusion Through Statistical Matching A research and education initiative at the MIT Sloan School of Management Data Fusion Through Statistical Matching Paper 185 Peter Van Der Puttan Joost N. Kok Amar Gupta January 2002 For more information,

More information

Time series prediction

Time series prediction Chapter 13 Time series prediction Amaury Lendasse, Timo Honkela, Federico Pouzols, Antti Sorjamaa, Yoan Miche, Qi Yu, Eric Severin, Mark van Heeswijk, Erkki Oja, Francesco Corona, Elia Liitiäinen, Zhanxing

More information

Learning Methods in Multilingual Speech Recognition

Learning Methods in Multilingual Speech Recognition Learning Methods in Multilingual Speech Recognition Hui Lin Department of Electrical Engineering University of Washington Seattle, WA 98125 linhui@u.washington.edu Li Deng, Jasha Droppo, Dong Yu, and Alex

More information

Using Deep Convolutional Neural Networks in Monte Carlo Tree Search

Using Deep Convolutional Neural Networks in Monte Carlo Tree Search Using Deep Convolutional Neural Networks in Monte Carlo Tree Search Tobias Graf (B) and Marco Platzner University of Paderborn, Paderborn, Germany tobiasg@mail.upb.de, platzner@upb.de Abstract. Deep Convolutional

More information

The Use of Statistical, Computational and Modelling Tools in Higher Learning Institutions: A Case Study of the University of Dodoma

The Use of Statistical, Computational and Modelling Tools in Higher Learning Institutions: A Case Study of the University of Dodoma International Journal of Computer Applications (975 8887) The Use of Statistical, Computational and Modelling Tools in Higher Learning Institutions: A Case Study of the University of Dodoma Gilbert M.

More information
