Econometrics

In economics, econometrics has been described broadly as the discipline that "aim[s] to give empirical content to economic relations." (Wikipedia.org)
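As a minimal illustration of that definition (a sketch added here with synthetic numbers, not part of the original listing), giving "empirical content" to an economic relation usually means fitting it to data. For example, a Keynesian consumption function C = a + bY can be estimated by ordinary least squares:

```python
import numpy as np

# Synthetic data for a hypothetical consumption function C = a + b*Y.
rng = np.random.default_rng(0)
income = rng.uniform(20, 100, size=200)                        # Y
consumption = 5.0 + 0.8 * income + rng.normal(0, 2, size=200)  # C with noise

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones_like(income), income])
beta, *_ = np.linalg.lstsq(X, consumption, rcond=None)
a_hat, b_hat = beta  # estimated intercept and marginal propensity to consume
```

With 200 observations the estimates land close to the true (a, b) = (5.0, 0.8); real econometric work would add standard errors, hypothesis tests, and specification checks.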






Conferences related to Econometrics


2013 9th International Conference on Wireless Communications, Networking and Mobile Computing (WiCOM)

All areas related to wireless communications, network technologies, and mobile computing systems.


2012 3rd International Conference on E-Business and E-Government (ICEE)

ICEE 2012 aims to provide a high-level international forum for researchers and engineers to present and discuss recent advances and new techniques in E-Business, E-Commerce, and E-Government.

  • 2011 International Conference on E-Business and E-Government (ICEE)

    E-Business and E-Commerce, E-Government, Engineering Management, Service Management & Knowledge Management

  • 2010 International Conference on E-Business and E-Government (ICEE)

    E-Business and E-Commerce, E-Government, E-Education, Engineering Management, Service Management & Knowledge Management


2012 7th International Conference on Computer Science & Education (ICCSE 2012)

Annual selected topics include cloud computing, the Internet of Things, recommendation systems, green communication, and web applications. The conference also covers the interaction between engineering education and computer science.


2012 International Conference on Management and Service Science (MASS 2012)

Enterprise Management, Engineering Management, Service Science, Financial Management, Knowledge Management


2011 3rd International Workshop on Education Technology and Computer Science (ETCS)

ETCS2011 will be held on 12-13 March 2011 in Wuhan, China. ETCS2009 and ETCS2010 were indexed by EI Compendex and ISTP. ETCS2011 will bring together leading researchers from the Asia-Pacific region, North America, Europe, and around the world to exchange research results and address open issues in Education Technology and Computer Science.

  • 2010 2nd International Workshop on Education Technology and Computer Science (ETCS)

    The 2nd International Workshop on Education Technology and Computer Science (ETCS2010) will be held on 6-7 March 2010 in Wuhan, China. The workshop will bring together leading researchers from the Asia-Pacific region, North America, Europe, and around the world to exchange research results and address open issues in Education Technology and Computer Science.

  • 2009 First International Workshop on Education Technology and Computer Science (ETCS)

    ETCS2009 is organized by Huazhong University of Science and Technology and Harbin Institute of Technology, and supported by the IEEE Harbin Section, Huazhong Normal University, and Wuhan University. ETCS 2009 will be convened in the magnificent cultural city of Wuhan, China. It will bring experts from several continents together to give presentations, exchange information, and learn about the latest developments in the fields of education and computer science.



Periodicals related to Econometrics


Automatic Control, IEEE Transactions on

The theory, design and application of Control Systems. It shall encompass components, and the integration of these components, as are necessary for the construction of such systems. The word "systems" as used herein shall be interpreted to include physical, biological, organizational and other entities and combinations thereof, which can be represented through a mathematical symbolism. The Field of Interest shall ...


Security & Privacy, IEEE

IEEE Security & Privacy seeks to stimulate and track advances in security, privacy, and dependability and present these advances for a broad cross-section of academic researchers and industry practitioners. IEEE Security & Privacy aims to provide a unique combination of research articles, case studies, tutorials, and regular departments covering diverse aspects of security and dependability of computer-based systems, including legal ...




Xplore Articles related to Econometrics


Features with fuzzy probability

A. Pieczyński; S. Robak; A. Walaszek-Babiszewska Proceedings. 11th IEEE International Conference and Workshop on the Engineering of Computer-Based Systems, 2004., 2004

This paper proposes an expert system description based on a feature diagram tree, annotated with weighted variant features in the software family context. The principle of how some variant features may be described on the basis of fuzzy logic is presented and fuzzy probability of market conditions is introduced. The proposed description integrates the external cross-tree constraints, and the resulting ...


A study on relationship between regional financial assets structure and economic growth in Zhejiang

Hong Zhou; Lan Tan; Yan Chen; Liqin Wang 2010 2nd IEEE International Conference on Information Management and Engineering, 2010

Regional financial development and economic growth are closely related. Growth in financial volume is important, but if it is not coordinated with the structure of financial assets, financial development will not be sustained, which in turn affects long-term economic growth. Starting from the regional financial asset structure and combining the Harrod-Domar model with Tobin's model of monetary assets, this paper established ...


Research on Relevancy between Circulation Industry and National Economy Based on Canonical Correlation Analysis

Jing Cao 2010 International Conference on Internet Technology and Applications, 2010

The circulation industry has already become an important part of China's national economy. Applying canonical correlation analysis, this paper empirically studies the relationship between the circulation industry and the other sectors of China's national economy. The author finds that the circulation industry has strong pulling effects on other industries and that the GDP of the national economy increases ...
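Canonical correlation analysis, the method named in this abstract, can be sketched in a few lines (an illustrative implementation assumed here, not taken from the paper): whiten each variable block with a QR decomposition, then the singular values of the cross-product are the canonical correlations.

```python
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations between two data blocks (rows = observations)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(X)  # orthonormal basis for the column space of X
    Qy, _ = np.linalg.qr(Y)
    # Singular values of Qx'Qy are the cosines of the principal angles
    # between the two column spaces, i.e. the canonical correlations,
    # sorted largest first.
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)

# Two synthetic blocks sharing one latent factor, plus independent noise.
rng = np.random.default_rng(1)
latent = rng.normal(size=(500, 1))
X = latent @ rng.normal(size=(1, 3)) + 0.5 * rng.normal(size=(500, 3))
Y = latent @ rng.normal(size=(1, 2)) + 0.5 * rng.normal(size=(500, 2))
rho = canonical_correlations(X, Y)  # leading value reflects the shared factor
```

Because the blocks share a latent factor, the first canonical correlation is large while the remaining ones stay near zero.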


Estimating future demand: a top down/bottom up approach for forecasting annual growths

C. A. Dortolina; R. Nadira IEEE Power Engineering Society General Meeting, 2005, 2005

The accurate estimation of future demand growth in power systems has important technical, economic, and regulatory repercussions. For example, future demand is a key input for regulators before approving future capital expenditures. Similarly, accurate estimation of future demand is extremely valuable when validating the configuration of the electricity sector prior to ring-fencing and/or privatization. This paper proposes a top-down/bottom-up approach for ...


Energy Efficiency Benefits: Is Technophilic Optimism Justified?

Rohit Nishant; Thompson S. H. Teo; Mark Goh IEEE Transactions on Engineering Management, 2014

Despite the increased focus on making energy more efficient, research has rarely examined the temporal impact of energy efficiency on environmental performance at the national level. Using archival data, we conduct an econometric analysis that provides empirical evidence indicating the negative effects of the rebound effect and the positive effects of technology. Our results suggest that an increase in per ...



Educational Resources on Econometrics



IEEE.tv Videos

No IEEE.tv Videos are currently tagged "Econometrics"

IEEE-USA E-Books

  • Model Analysis

    Simulation modeling is increasingly integrated into research and policy analysis of complex sociotechnical systems in a variety of domains. Model-based analysis and policy design inform a range of applications in fields from economics to engineering to health care. This book offers a hands-on introduction to key analytical methods for dynamic modeling. Bringing together tools and methodologies from fields as diverse as computational statistics, econometrics, and operations research in a single text, the book can be used for graduate-level courses and as a reference for dynamic modelers who want to expand their methodological toolbox. The focus is on quantitative techniques for use by dynamic modelers during model construction and analysis, and the material presented is accessible to readers with a background in college-level calculus and statistics. Each chapter describes a key method, presenting an introduction that emphasizes the basic intuition behind each method, tutorial-style examples, references to key literature, and exercises. The chapter authors are all experts in the tools and methods they present. The book covers estimation of model parameters using quantitative data; understanding the links between model structure and its behavior; and decision support and optimization. An online appendix offers computer code for applications, models, and solutions to exercises. **Contributors:** Wenyi An, Edward G. Anderson Jr., Yaman Barlas, Nishesh Chalise, Robert Eberlein, Hamed Ghoddusi, Winfried Grassmann, Peter S. Hovmand, Mohammad S. Jalali, Nitin Joglekar, David Keith, Juxin Liu, Erling Moxnes, Rogelio Oliva, Nathaniel D. Osgood, Hazhir Rahmandad, Raymond Spiteri, John Sterman, Jeroen Struben, Burcu Tan, Karen Yee, Gönenç Yücel

  • Subject Index

    The minimum description length (MDL) principle is a powerful method of inductive inference, the basis of statistical modeling, pattern recognition, and machine learning. It holds that the best explanation, given a limited set of observed data, is the one that permits the greatest compression of the data. MDL methods are particularly well-suited for dealing with model selection, prediction, and estimation problems in situations where the models under consideration can be arbitrarily complex, and overfitting the data is a serious concern. This extensive, step-by-step introduction to the MDL Principle provides a comprehensive reference (with an emphasis on conceptual issues) that is accessible to graduate students and researchers in statistics, pattern classification, machine learning, and data mining, to philosophers interested in the foundations of statistics, and to researchers in other applied sciences that involve model selection, including biology, econometrics, and experimental psychology. Part I provides a basic introduction to MDL and an overview of the concepts in statistics and information theory needed to understand MDL. Part II treats universal coding, the information-theoretic notion on which MDL is built, and Part III gives a formal treatment of MDL theory as a theory of inductive inference based on universal coding. Part IV provides a comprehensive overview of the statistical theory of exponential families with an emphasis on their information-theoretic properties. The text includes a number of summaries, paragraphs offering the reader a "fast track" through the material, and boxes highlighting the most important concepts.

  • Decision Support and Optimization


  • References


  • Contributors


  • List of Symbols


  • Estimation of Model Parameters


  • Introductory Material


  • Additional Background


  • Index


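The MDL principle described in the e-book listings above (prefer the model that compresses the data most) is often approximated in practice by a two-part code length. The sketch below is illustrative only and not drawn from those books: it uses BIC, the usual large-sample stand-in for two-part MDL, to select a polynomial degree for noisy synthetic data.

```python
import numpy as np

# Noisy observations of a quadratic relation.
rng = np.random.default_rng(2)
x = rng.uniform(-1, 1, size=300)
y = 1.0 + 2.0 * x + 3.0 * x**2 + rng.normal(0, 0.5, size=300)

def code_length(degree):
    """Two-part code length proxy (in nats): data cost + model cost."""
    coeffs = np.polyfit(x, y, degree)
    rss = np.sum((np.polyval(coeffs, x) - y) ** 2)
    n, k = len(x), degree + 1
    # 0.5*n*log(RSS/n) encodes the residuals; 0.5*k*log(n) encodes the model.
    return 0.5 * n * np.log(rss / n) + 0.5 * k * np.log(n)

best = min(range(6), key=code_length)  # candidate degrees 0..5
```

Higher-degree polynomials always reduce the residual sum of squares, but the 0.5*k*log(n) model-cost term penalizes the extra coefficients, so the degree matching the true data-generating process tends to minimize the total code length.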


Standards related to Econometrics


No standards are currently tagged "Econometrics"


Jobs related to Econometrics
