Room 465C, 3401 Walnut Street, Philadelphia, PA 19104
This course will serve as an introductory, hands-on dive into the area of deep learning. The main goal is to educate students on (i) the commonly used neural network architectures and proficiency in training them, and (ii) some of the main problems that deep learning systems have successfully addressed (formulation, architecture, datasets, etc.). There will be no theory in this course. After finishing this course, students should be very comfortable with PyTorch programming as well as with training deep learning models.
ESE3060001 (Syllabus)
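As a purely illustrative sketch (not course material), the snippet below shows the kind of PyTorch training loop the course aims to make students comfortable with; the toy network, synthetic data, and hyperparameters are arbitrary choices made only for this example.

```python
# Hypothetical example (not from the course): training a small
# fully-connected network on synthetic regression data with PyTorch.
import torch
import torch.nn as nn

# Synthetic data: 256 samples, 10 features, noisy linear targets.
X = torch.randn(256, 10)
y = X @ torch.randn(10, 1) + 0.1 * torch.randn(256, 1)

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

for epoch in range(100):
    optimizer.zero_grad()        # reset accumulated gradients
    loss = loss_fn(model(X), y)  # forward pass and loss
    loss.backward()              # backpropagation
    optimizer.step()             # parameter update

print(f"final training loss: {loss.item():.4f}")
```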
The course covers the methodological foundations of data science, emphasizing basic concepts in statistics and learning theory as well as modern methodologies. Learning of distributions and their parameters. Testing of multiple hypotheses. Linear and nonlinear regression and prediction. Classification. Uncertainty quantification. Model validation. Clustering. Dimensionality reduction. Probably approximately correct (PAC) learning. These theoretical concepts are further complemented by exemplar applications, case studies (datasets), and programming exercises (in Python) drawn from electrical engineering, computer science, the life sciences, finance, and social networks.
ESE4020401
ESE4020402
The course covers the methodological foundations of data science, emphasizing basic concepts in statistics and learning theory as well as modern methodologies. Learning of distributions and their parameters. Testing of multiple hypotheses. Linear and nonlinear regression and prediction. Classification. Uncertainty quantification. Model validation. Clustering. Dimensionality reduction. Probably approximately correct (PAC) learning. These theoretical concepts are further complemented by exemplar applications, case studies (datasets), and programming exercises (in Python) drawn from electrical engineering, computer science, the life sciences, finance, and social networks.
ESE5420401
ESE5420402
ESE5420501
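As an illustration of two of the topics listed above, linear regression and model validation, the following minimal Python sketch fits a least-squares model on synthetic data and evaluates it on a held-out split; the data and split are invented for this example and are not drawn from the course.

```python
# Hypothetical illustration (not course material): ordinary least-squares
# regression with a simple train/validation split.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))                # 500 samples, 5 features
w_true = rng.normal(size=5)
y = X @ w_true + 0.1 * rng.normal(size=500)  # noisy linear responses

# Hold out 20% of the data for validation.
X_train, X_val = X[:400], X[400:]
y_train, y_val = y[:400], y[400:]

# Fit by least squares on the training split.
w_hat, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)

# Validate on the held-out data.
val_mse = np.mean((X_val @ w_hat - y_val) ** 2)
print(f"validation MSE: {val_mse:.4f}")
```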
For students working on an advanced research program leading to the completion of master's thesis or Ph.D. dissertation requirements.
ESE9990006
Independent Study allows students to pursue academic interests not available in regularly offered courses. Students must consult with their academic advisor to formulate a project directly related to the student’s research interests. All independent study courses are subject to the approval of the AMCS Graduate Group Chair.
Study under the direction of a faculty member.
For master's students studying a specific advanced subject area in computer and information science. Involves coursework and class presentations. A CIS 5990 course unit will invariably include formally gradable work comparable to that in a CIS 500-level course. Students should discuss with the faculty supervisor the scope of the Independent Study, expectations, work involved, etc.
An opportunity for the student to become closely associated with a professor in (1) a research effort to develop research skills and technique and/or (2) to develop a program of independent in-depth study in a subject area in which the professor and student have a common interest. The challenge of the task undertaken must be consistent with the student's academic level. To register for this course, the student and professor jointly submit a detailed proposal to the undergraduate curriculum chairman no later than the end of the first week of the term.
Introduction to a broad range of tools for analyzing large volumes of data in order to transform them into actionable decisions. Using case studies and hands-on exercises, students will have the opportunity to practice and improve their data analysis skills.
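The description does not name specific tools; assuming Python with pandas (an assumption made only for this sketch), a small hands-on exercise of the kind described might look like the following.

```python
# Hypothetical sketch (the dataset and columns are invented): aggregating
# tabular data with pandas to support a simple business decision.
import pandas as pd

sales = pd.DataFrame({
    "region":  ["East", "East", "West", "West", "West"],
    "revenue": [120.0, 95.0, 210.0, 180.0, 160.0],
})

# Average revenue per region, sorted to surface the strongest market.
summary = sales.groupby("region")["revenue"].mean().sort_values(ascending=False)
print(summary)
```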
This course will serve as an introductory, hands-on dive into the area of deep learning. The main goal is to educate students on (i) the commonly used neural network architectures and proficiency in training them, and (ii) some of the main problems that deep learning systems have successfully addressed (formulation, architecture, datasets, etc.). There will be no theory in this course. After finishing this course, students should be very comfortable with PyTorch programming as well as with training deep learning models.
The course covers the methodological foundations of data science, emphasizing basic concepts in statistics and learning theory as well as modern methodologies. Learning of distributions and their parameters. Testing of multiple hypotheses. Linear and nonlinear regression and prediction. Classification. Uncertainty quantification. Model validation. Clustering. Dimensionality reduction. Probably approximately correct (PAC) learning. These theoretical concepts are further complemented by exemplar applications, case studies (datasets), and programming exercises (in Python) drawn from electrical engineering, computer science, the life sciences, finance, and social networks.
Many scientific and commercial applications require us to obtain insights from massive, high-dimensional data sets. In this graduate-level course, students will learn to apply, analyze, and evaluate principled, state-of-the-art techniques from statistics, algorithms, and discrete and convex optimization for learning from such large data sets. The course covers both theoretical foundations and practical applications.
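One standard example of such a technique, chosen here only for illustration and not taken from the course syllabus, is mini-batch stochastic gradient descent, which avoids touching the full dataset at every update.

```python
# Hypothetical sketch (not from the course): mini-batch stochastic gradient
# descent for logistic regression on a moderately large synthetic dataset.
import numpy as np

rng = np.random.default_rng(1)
n, d = 10_000, 50
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = (X @ w_true + 0.5 * rng.normal(size=n) > 0).astype(float)  # binary labels

w = np.zeros(d)
lr, batch = 0.1, 128
for step in range(2_000):
    idx = rng.integers(0, n, size=batch)   # sample a mini-batch
    Xb, yb = X[idx], y[idx]
    p = 1.0 / (1.0 + np.exp(-Xb @ w))      # predicted probabilities
    grad = Xb.T @ (p - yb) / batch         # gradient of the logistic loss
    w -= lr * grad

accuracy = np.mean(((X @ w) > 0) == (y == 1))
print(f"training accuracy: {accuracy:.3f}")
```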
For students working on advanced research leading to the completion of a master's thesis.
This course concentrates on recognizing and solving convex optimization problems that arise in engineering. Topics include: convex sets, functions, and optimization problems. Basics of convex analysis. Linear, quadratic, geometric, and semidefinite programming. Optimality conditions, duality theory, theorems of alternatives, and applications. Interior-point methods, the ellipsoid algorithm and barrier methods, self-concordance. Applications to signal processing, control, digital and analog circuit design, computational geometry, statistics, and mechanical engineering. Requires knowledge of linear algebra and a willingness to do programming. Exposure to numerical computing, optimization, and application fields is helpful but not required.
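As a concrete, purely illustrative instance of recognizing and solving a convex problem, the sketch below formulates nonnegative least squares as a quadratic program; the cvxpy modeling library and the random problem data are assumptions for this example, not part of the course description.

```python
# Hypothetical illustration: nonnegative least squares, an instance of
# quadratic programming, solved with the cvxpy modeling library.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
A = rng.normal(size=(30, 10))
b = rng.normal(size=30)

x = cp.Variable(10)
objective = cp.Minimize(cp.sum_squares(A @ x - b))  # convex quadratic objective
constraints = [x >= 0]                              # convex constraint set
problem = cp.Problem(objective, constraints)
problem.solve()

print("optimal value:", problem.value)
print("solution x:", x.value)
```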
Advanced and specialized topics in both theory and application areas. Students should check with the Graduate Group office for offerings during each registration period.
For students working on an advanced research program leading to the completion of master's thesis or Ph.D. dissertation requirements.