Econometrics

In economics, econometrics has been described broadly as the discipline that "aim[s] to give empirical content to economic relations." (Wikipedia.org)
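The definition above can be made concrete with a minimal sketch: ordinary least squares applied to synthetic data to estimate a demand elasticity. The log-log demand relation and every number below are invented for illustration only.

```python
import math
import random

# Minimal illustration of "giving empirical content to economic relations":
# estimate a constant-elasticity demand curve log(q) = a + b*log(p) by
# ordinary least squares. The relation and all numbers are invented.
random.seed(0)
true_a, true_b = 5.0, -1.2                     # assumed intercept and elasticity
log_p = [math.log(1.0 + 0.1 * i) for i in range(100)]
log_q = [true_a + true_b * lp + random.gauss(0, 0.05) for lp in log_p]

# Closed-form OLS for one regressor: slope = cov(x, y) / var(x)
n = len(log_p)
mx, my = sum(log_p) / n, sum(log_q) / n
b_hat = (sum((x - mx) * (y - my) for x, y in zip(log_p, log_q))
         / sum((x - mx) ** 2 for x in log_p))
a_hat = my - b_hat * mx
# b_hat recovers the price elasticity, close to the assumed -1.2
```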






Conferences related to Econometrics


2013 9th International Conference on Wireless Communications, Networking and Mobile Computing (WiCOM)

All areas related to wireless communications, network technologies, and mobile computing systems.


2012 3rd International Conference on E-Business and E-Government (ICEE)

ICEE 2012 aims to provide a high-level international forum for researchers and engineers to present and discuss recent advances and new techniques in E-Business, E-Commerce, and E-Government.

  • 2011 International Conference on E-Business and E-Government (ICEE)

    E-Business and E-Commerce, E-Government, Engineering Management, Service Management & Knowledge Management

  • 2010 International Conference on E-Business and E-Government (ICEE)

    E-Business and E-Commerce, E-Government, E-Education, Engineering Management, Service Management & Knowledge Management


2012 7th International Conference on Computer Science & Education (ICCSE 2012)

Annual selected topics include cloud computing, the Internet of Things, recommendation systems, green communication, and web applications. The conference also covers the interaction between engineering education and computer science.


2012 International Conference on Management and Service Science (MASS 2012)

Enterprise Management, Engineering Management, Service Science, Financial Management, Knowledge Management


2011 3rd International Workshop on Education Technology and Computer Science (ETCS)

ETCS2011 will be held on 12-13 March, 2011 in Wuhan, China. ETCS2009 and ETCS2010 were indexed by EI Compendex and ISTP. ETCS2011 will bring together top researchers from Asia-Pacific nations, North America, Europe, and around the world to exchange their research results and address open issues in Education Technology and Computer Science.

  • 2010 2nd International Workshop on Education Technology and Computer Science (ETCS)

    The 2nd International Workshop on Education Technology and Computer Science (ETCS2010) will be held on 6-7 March, 2010 in Wuhan, China. The workshop will bring together top researchers from Asia-Pacific nations, North America, Europe, and around the world to exchange their research results and address open issues in Education Technology and Computer Science.

  • 2009 First International Workshop on Education Technology and Computer Science (ETCS)

    ETCS2009 is organized by Huazhong University of Science and Technology and Harbin Institute of Technology, and supported by the IEEE Harbin Section, Huazhong Normal University, and Wuhan University. ETCS 2009 will be convened in the magnificent cultural city of Wuhan, China, and will bring experts from several continents to give presentations, exchange information, and learn about the latest developments in the field of education and computer science.


More Conferences

Periodicals related to Econometrics


Automatic Control, IEEE Transactions on

The theory, design and application of Control Systems. It shall encompass components, and the integration of these components, as are necessary for the construction of such systems. The word "systems" as used herein shall be interpreted to include physical, biological, organizational and other entities and combinations thereof, which can be represented through a mathematical symbolism. The Field of Interest shall ...


Security & Privacy, IEEE

IEEE Security & Privacy seeks to stimulate and track advances in security, privacy, and dependability and present these advances for a broad cross-section of academic researchers and industry practitioners. IEEE Security & Privacy aims to provide a unique combination of research articles, case studies, tutorials, and regular departments covering diverse aspects of security and dependability of computer-based systems, including legal ...




Xplore Articles related to Econometrics


Forecast and analysis of the soybean import in China

Jun Yang; Tianxiang Yao; Rong Zhang 2004 IEEE International Conference on Systems, Man and Cybernetics (IEEE Cat. No.04CH37583), 2004

Soybean is an important crop in China. Before 1996, China was a net exporter of soybean; the situation has since reversed, and imports have increased dramatically. Imports in 2003 exceeded total domestic soybean production in 2002. This surprising increase concerns many scholars and policy-makers in China. To date, most research has been qualitative. The ...


A novice's guide to micro Monte Carlo [load forecasting]

D. T. Kexel Rural Electric Power Conference, Papers Presented at the 32nd Annual Conference, 1988

The use of Monte Carlo methods to develop probability distributions of electric load forecasts is described. The methods can be used with either end-use or econometric forecasts and can be implemented on a microcomputer using LOTUS 1-2-3. The concept, forecasting model construction, probability distribution selection, random number generation, and evaluation of the forecast distribution are discussed.
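The approach this abstract describes, using Monte Carlo draws to turn a point load forecast into a probability distribution, can be sketched in a few lines. All figures below are invented, and the two random drivers merely stand in for whatever uncertain inputs an end-use or econometric model would use.

```python
import random
import statistics

# Sketch of the micro Monte Carlo idea: propagate uncertainty in the drivers
# of a load forecast by repeated random draws, then summarize the resulting
# distribution of forecasts. All numbers are hypothetical.
random.seed(42)
base_load_mw = 100.0
trials = []
for _ in range(10_000):
    growth = random.gauss(0.02, 0.01)      # uncertain annual demand growth
    weather = random.uniform(0.95, 1.05)   # uncertain weather adjustment
    trials.append(base_load_mw * (1 + growth) ** 5 * weather)

# deciles of the simulated 5-year-ahead load replace a single point forecast
deciles = statistics.quantiles(trials, n=10)
p10, p50, p90 = deciles[0], deciles[4], deciles[8]
```

Reporting the p10/p50/p90 band rather than one number is the practical payoff: the forecast's spread is visible to the planner.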


Scheduling preventive railway maintenance activities

G. Budai; D. Huisman; R. Dekker 2004 IEEE International Conference on Systems, Man and Cybernetics (IEEE Cat. No.04CH37583), 2004

We provide some useful methods for finding optimal track possession intervals for carrying out preventive maintenance works in such a way that the inconvenience for the train operators and the infrastructure possession time is minimised. In our models, mostly the train-free periods, namely the hours between two consecutive train operations, are used for carrying out routine maintenance works. Moreover, ...
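The core idea, fitting maintenance jobs into the gaps between consecutive trains, can be sketched minimally. All times below are invented, and the paper's actual models are optimization models, not this simple filter.

```python
# Toy version of the "train-free period" idea: given the minutes at which
# consecutive trains occupy a track section, keep only the gaps long enough
# to host a routine maintenance job. All times are hypothetical.
train_times = [0, 45, 60, 180, 200, 330]   # minutes at which trains pass
job_minutes = 90                           # track possession the job needs

windows = [(start, end)
           for start, end in zip(train_times, train_times[1:])
           if end - start >= job_minutes]
# windows == [(60, 180), (200, 330)]: the two gaps that fit the job
```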


Features with fuzzy probability

A. Pieczyński; S. Robak; A. Walaszek-Babiszewska Proceedings. 11th IEEE International Conference and Workshop on the Engineering of Computer-Based Systems, 2004., 2004

This paper proposes an expert system description based on a feature diagram tree, annotated with weighted variant features in the software family context. The principle of how some variant features may be described on the basis of fuzzy logic is presented and fuzzy probability of market conditions is introduced. The proposed description integrates the external cross-tree constraints, and the resulting ...


Estimating the Effects of Climatic Change on Grain Production: Spatial Versus Non-Spatial Models

Wei Huang; Xiangzheng Deng; Jinyan Zhan; Yingzhi Lin 2009 2nd International Conference on Biomedical Engineering and Informatics, 2009

This paper aims to estimate the effects of climatic change on grain production in China through a cross-sectional analysis of a county-level dataset with variables measuring total grain output, climate, and other economic and geographical data for over 2,200 counties of China in 2000. A non-spatial model, using the Ordinary Least Squares (OLS) approach, was ...
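The spatial-versus-non-spatial contrast can be sketched with synthetic data. This is an invented toy, not the paper's models: counties sit on a line, and each county's outcome depends on its own covariate and its neighbors' outcomes. One caveat is built into the comments: regressing on the spatial lag by OLS is shown only for illustration, since the lag is endogenous and real spatial econometrics estimates such models by maximum likelihood or instrumental variables.

```python
import numpy as np

# Synthetic contrast between a non-spatial and a spatial specification.
# Caveat: adding the spatial lag W @ y as an OLS regressor is illustrative
# only; the lag is endogenous, so proper estimation uses ML or instruments.
rng = np.random.default_rng(0)
n = 200
W = np.zeros((n, n))                       # adjacency of neighboring counties
for i in range(n):
    for j in (i - 1, i + 1):
        if 0 <= j < n:
            W[i, j] = 1.0
W /= W.sum(axis=1, keepdims=True)          # row-standardized weights

x = rng.normal(size=n)
rho, beta = 0.4, 2.0
# data-generating process y = rho*W*y + beta*x + eps, solved for y
y = np.linalg.solve(np.eye(n) - rho * W,
                    beta * x + rng.normal(scale=0.1, size=n))

def rss(X):
    """Residual sum of squares of an OLS fit of y on the columns of X."""
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.sum((y - X @ beta_hat) ** 2))

rss_plain = rss(np.column_stack([np.ones(n), x]))           # non-spatial OLS
rss_spatial = rss(np.column_stack([np.ones(n), x, W @ y]))  # adds spatial lag
# the spatial specification absorbs neighbor spillovers the plain model misses
```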


More Xplore Articles

Educational Resources on Econometrics


eLearning



More eLearning Resources

IEEE.tv Videos

No IEEE.tv Videos are currently tagged "Econometrics"

IEEE-USA E-Books

  • Contributors

    Simulation modeling is increasingly integrated into research and policy analysis of complex sociotechnical systems in a variety of domains. Model-based analysis and policy design inform a range of applications in fields from economics to engineering to health care. This book offers a hands-on introduction to key analytical methods for dynamic modeling. Bringing together tools and methodologies from fields as diverse as computational statistics, econometrics, and operations research in a single text, the book can be used for graduate-level courses and as a reference for dynamic modelers who want to expand their methodological toolbox. The focus is on quantitative techniques for use by dynamic modelers during model construction and analysis, and the material presented is accessible to readers with a background in college-level calculus and statistics. Each chapter describes a key method, presenting an introduction that emphasizes the basic intuition behind each method, tutorial-style examples, references to key literature, and exercises. The chapter authors are all experts in the tools and methods they present. The book covers estimation of model parameters using quantitative data; understanding the links between model structure and its behavior; and decision support and optimization. An online appendix offers computer code for applications, models, and solutions to exercises. **Contributors:** Wenyi An, Edward G. Anderson Jr., Yaman Barlas, Nishesh Chalise, Robert Eberlein, Hamed Ghoddusi, Winfried Grassmann, Peter S. Hovmand, Mohammad S. Jalali, Nitin Joglekar, David Keith, Juxin Liu, Erling Moxnes, Rogelio Oliva, Nathaniel D. Osgood, Hazhir Rahmandad, Raymond Spiteri, John Sterman, Jeroen Struben, Burcu Tan, Karen Yee, Gönenç Yücel

  • Subject Index

    The minimum description length (MDL) principle is a powerful method of inductive inference, the basis of statistical modeling, pattern recognition, and machine learning. It holds that the best explanation, given a limited set of observed data, is the one that permits the greatest compression of the data. MDL methods are particularly well-suited for dealing with model selection, prediction, and estimation problems in situations where the models under consideration can be arbitrarily complex, and overfitting the data is a serious concern. This extensive, step-by-step introduction to the MDL Principle provides a comprehensive reference (with an emphasis on conceptual issues) that is accessible to graduate students and researchers in statistics, pattern classification, machine learning, and data mining, to philosophers interested in the foundations of statistics, and to researchers in other applied sciences that involve model selection, including biology, econometrics, and experimental psychology. Part I provides a basic introduction to MDL and an overview of the concepts in statistics and information theory needed to understand MDL. Part II treats universal coding, the information-theoretic notion on which MDL is built, and part III gives a formal treatment of MDL theory as a theory of inductive inference based on universal coding. Part IV provides a comprehensive overview of the statistical theory of exponential families with an emphasis on their information-theoretic properties. The text includes a number of summaries, paragraphs offering the reader a "fast track" through the material, and boxes highlighting the most important concepts.

  • Additional Background


  • Index


  • Decision Support and Optimization


  • List of Symbols


  • References


  • Estimation of Model Parameters


  • Model Analysis


  • Introductory Material

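The MDL principle described in the e-book entries above can be illustrated with a toy two-part code-length criterion for choosing a polynomial degree. The formula below is a standard textbook approximation of two-part MDL, not the book's own development, and the data are synthetic.

```python
import math
import numpy as np

# Toy two-part MDL model selection: pick the polynomial degree k minimizing
#     L(k) = (n/2) * log(RSS_k / n) + ((k + 1) / 2) * log(n),
# i.e. the cost of encoding the residuals plus the cost of encoding the
# k + 1 fitted coefficients. Synthetic data with true degree 2.
rng = np.random.default_rng(1)
n = 200
x = np.linspace(-1.0, 1.0, n)
y = 1.0 - 2.0 * x + 3.0 * x**2 + rng.normal(scale=0.1, size=n)

def description_length(k):
    coeffs = np.polyfit(x, y, k)
    rss = float(np.sum((y - np.polyval(coeffs, x)) ** 2))
    return 0.5 * n * math.log(rss / n) + 0.5 * (k + 1) * math.log(n)

best_k = min(range(6), key=description_length)
# the log(n) penalty stops higher degrees despite their slightly smaller RSS,
# so best_k typically lands on the true degree, 2
```

This is the compression intuition in miniature: a higher-degree model always fits the data a little better, but the extra parameters cost more bits than the shorter residual code saves.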



Standards related to Econometrics


No standards are currently tagged "Econometrics"


Jobs related to Econometrics
