SEDAR 63 Benchmark Assessment: Gulf Menhaden Terms of Reference. Terminal Year: 2017

Data Workshop Terms of Reference

1) Review stock structure and unit stock definitions and consider whether changes are required.

2) Review, discuss, and tabulate available life history information.
   a) Evaluate age, growth, natural mortality, and reproductive characteristics.
   b) Provide appropriate models to describe growth, maturation, and fecundity by age, sex, or length as applicable (a growth-curve fitting sketch follows these Terms of Reference).
   c) Evaluate the adequacy of available life history information for conducting stock assessments and recommend life history information for use in population modeling.
   d) Evaluate, discuss, and characterize the sources of uncertainty and data limitations.

3) Provide measures of population abundance that are appropriate for stock assessment.
   a) Consider and discuss all available and relevant fishery-independent data sources, including:
      i) state long-term monitoring programs and
      ii) SEAMAP surveys (plankton, trawl, and other).
   b) Document all programs evaluated; address program objectives, methods, coverage, sampling intensity, and other relevant characteristics.
   c) Develop fishery and survey CPUE indices by appropriate strata (e.g., age, size, area, and fishery) and include measures of precision and accuracy (a CPUE standardization sketch follows these Terms of Reference).
   d) Discuss the degree to which available indices adequately represent population conditions.
   e) Recommend which data sources adequately and reliably represent population abundance for use in assessment modeling.
   f) Evaluate, discuss, and characterize the sources of uncertainty and data limitations.

4) Evaluate and discuss all available fishery-dependent catch statistics and data sources, including:
   a) reconstructed early landings;
   b) reduction landings to present;
   c) bait landings to present;
   d) recreational landings to present;
   e) nominal effort data to present; and
   f) the catch-at-age matrix.
   g) Evaluate, discuss, and characterize the sources of uncertainty and data limitations.

5) Provide recommendations for future research in areas such as life history, fishery monitoring, and survey sampling.

6) Prepare the Data Workshop report, providing complete documentation of workshop actions and decisions, in accordance with project schedule deadlines (Section II of the SEDAR assessment report).
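
TOR 2b calls for models of growth, maturation, and fecundity. For growth, a von Bertalanffy curve, L(a) = Linf(1 - exp(-k(a - t0))), is the conventional choice. The sketch below is a minimal illustration of fitting that curve by nonlinear least squares: the age-length pairs, starting values, and units are invented placeholders, not Gulf menhaden data or SEDAR 63 decisions.

    # Hypothetical sketch: fit a von Bertalanffy growth curve to age-length data.
    # The observations below are made up for illustration only.
    import numpy as np
    from scipy.optimize import curve_fit

    def von_bertalanffy(age, linf, k, t0):
        """Predicted length at age under the von Bertalanffy model."""
        return linf * (1.0 - np.exp(-k * (age - t0)))

    # Illustrative age (years) and fork length (mm) observations.
    ages = np.array([0, 1, 1, 2, 2, 3, 3, 4, 4, 5], dtype=float)
    lengths = np.array([80, 140, 150, 185, 190, 210, 215, 225, 230, 238], dtype=float)

    # Starting values: Linf near the largest observed length, modest k, t0 near 0.
    params, cov = curve_fit(von_bertalanffy, ages, lengths, p0=[250.0, 0.5, -0.5])
    se = np.sqrt(np.diag(cov))  # asymptotic standard errors for Linf, k, t0
    for name, est, err in zip(["Linf", "k", "t0"], params, se):
        print(f"{name}: {est:.2f} (SE {err:.2f})")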
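
For the standardized CPUE indices and precision measures requested in TOR 3c, one widely used approach is a log-linear model with year, area, and season factors, taking the exponentiated year effects as the relative abundance index and their standard errors (on the log scale) as the precision measure. The sketch below runs on simulated data with placeholder column names (year, area, season, cpue); it illustrates the mechanics only and is not the standardization adopted by the workshop.

    # Hedged sketch of CPUE standardization via a log-linear model.
    # All data are simulated; column names are placeholders.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 400
    df = pd.DataFrame({
        "year": rng.integers(2008, 2018, n).astype(str),  # treated as a factor
        "area": rng.choice(["west", "central", "east"], n),
        "season": rng.choice(["spring", "summer", "fall"], n),
        "cpue": rng.lognormal(mean=1.0, sigma=0.6, size=n),
    })

    # OLS on log(CPUE); year enters as a factor so each year gets its own effect.
    fit = smf.ols("np.log(cpue) ~ C(year) + C(area) + C(season)", data=df).fit()

    # Year coefficients (relative to the first year), back-transformed to an
    # index, with their log-scale standard errors as the precision measure.
    year_terms = [t for t in fit.params.index if t.startswith("C(year)")]
    index = np.exp(fit.params[year_terms])
    print(pd.DataFrame({"index": index, "se_log": fit.bse[year_terms]}))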

Assessment Workshop Terms of Reference

1) Review any changes in data following the data workshop and any analyses suggested by the data workshop. Summarize the data as used in each assessment model. Provide justification for any deviations from Data Workshop recommendations.

2) Use population assessment models consistent with the available data. Consider the modeling recommendations from the last benchmark assessment review and discuss how they were addressed in this assessment. Recommend the models and configurations considered most reliable or useful for providing advice. Document all input data, assumptions, and equations for each model prepared.

3) Evaluate the models used to estimate population parameters and provide estimates of the population parameters (e.g., F, biomass, abundance, selectivity, and other parameters as appropriate) and biological reference points.
   a) Did the model have difficulty finding a stable solution?
   b) Were sensitivity analyses for any priors performed? Were other model diagnostics performed?
   c) Have the model strengths and limitations been clearly and thoroughly explained?
   d) Have the models been used in other peer-reviewed assessments? If not, has the new model code been verified with simulated data?
   e) Compare and discuss differences among alternative models.
   f) Provide appropriate measures of model performance, reliability, and goodness of fit.

4) Characterize uncertainty in the assessment and estimated values.
   a) Consider uncertainty in input data, modeling approach, and model configuration.
   b) Provide measures of uncertainty for relevant model output.

5) Perform retrospective analyses; assess the magnitude and direction of any retrospective patterns detected; and discuss the implications of any observed retrospective pattern for uncertainty in population parameters (e.g., F, SSB), reference points, and management measures (a Mohn's rho sketch follows these Terms of Reference).

6) Provide evaluations of yield and productivity.
   a) Include yield-per-recruit, spawner-per-recruit, and stock-recruitment evaluations (a per-recruit sketch follows these Terms of Reference).

7) Provide estimates of population benchmarks or management criteria consistent with the available data and the applicable FMP. Evaluate existing management criteria as specified in the management summary. Recommend proxy values when necessary.

8) Provide declarations of stock status relative to the management benchmarks or, if necessary, alternative data-poor approaches.

9) Provide an analysis describing the uncertainty of the proposed stock status relative to the reference points.

10) Develop detailed short- and long-term prioritized lists of recommendations for future research, data collection, and assessment methodology. Highlight improvements to be made by the next benchmark review.

11) Complete the Assessment Workshop Report for Review.
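
For the retrospective analyses in TOR 5, a standard summary of retrospective bias is Mohn's rho: the mean relative difference between each peel's estimate in its own terminal year and the full-series estimate for that same year. The SSB values below are invented placeholders, and mohns_rho is a hypothetical helper; the sketch only shows the computation.

    # Illustrative computation of Mohn's rho from a set of retrospective peels.
    import numpy as np

    def mohns_rho(full, peels):
        """full: dict year -> estimate from the complete model run.
        peels: list of dicts, each a rerun with one more terminal year removed."""
        diffs = []
        for peel in peels:
            term_year = max(peel)  # terminal year of this peel
            diffs.append((peel[term_year] - full[term_year]) / full[term_year])
        return float(np.mean(diffs))

    # Hypothetical SSB estimates (thousand mt) by year; not assessment output.
    full = {2013: 410.0, 2014: 395.0, 2015: 380.0, 2016: 372.0, 2017: 365.0}
    peels = [
        {2013: 418.0, 2014: 402.0, 2015: 388.0, 2016: 381.0},  # drop 2017
        {2013: 423.0, 2014: 409.0, 2015: 396.0},               # drop 2016-2017
        {2013: 427.0, 2014: 415.0},                            # drop 2015-2017
    ]
    # Consistently positive rho suggests systematic upward revision of SSB.
    print(f"Mohn's rho (SSB): {mohns_rho(full, peels):+.3f}")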
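
For the per-recruit evaluations in TOR 6a, yield-per-recruit (YPR) and spawner-per-recruit (SPR) follow from survivorship at age under the Baranov catch equation. Every biological input below (natural mortality M, weight, maturity, and selectivity at age) is a made-up placeholder rather than an assessment value; the sketch shows only the per-recruit arithmetic and the %SPR calculation.

    # Minimal per-recruit sketch: equilibrium YPR and SPR over a grid of F.
    import numpy as np

    M = 1.1                                  # natural mortality (per year), placeholder
    ages = np.arange(0, 6)
    weight = np.array([0.02, 0.09, 0.15, 0.19, 0.22, 0.24])  # kg at age, placeholder
    maturity = np.array([0.0, 0.5, 1.0, 1.0, 1.0, 1.0])      # placeholder
    select = np.array([0.1, 0.8, 1.0, 1.0, 1.0, 1.0])        # fishery selectivity, placeholder

    def per_recruit(F):
        Z = M + F * select
        # Survivorship to the start of each age, per recruit.
        N = np.concatenate(([1.0], np.cumprod(np.exp(-Z[:-1]))))
        # Baranov catch equation: catch at age = (F_a / Z_a) * N_a * (1 - exp(-Z_a)).
        ypr = np.sum(F * select / Z * N * (1.0 - np.exp(-Z)) * weight)
        spr = np.sum(N * maturity * weight)  # spawning biomass per recruit
        return ypr, spr

    spr0 = per_recruit(0.0)[1]  # unfished spawner-per-recruit, the %SPR baseline
    for F in (0.0, 0.5, 1.0, 1.5, 2.0):
        ypr, spr = per_recruit(F)
        print(f"F={F:.1f}  YPR={ypr:.4f} kg  %SPR={100 * spr / spr0:.1f}")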

Review Workshop Terms of Reference

1) Evaluate the data used in the assessment, addressing the following:
   a) Are data decisions made by the Data and Assessment Workshops sound and robust?
   b) Are data uncertainties acknowledged, reported, and within normal or expected levels?
   c) Are data applied properly within the assessment model?
   d) Are input data series reliable and sufficient to support the assessment approach and findings?

2) Evaluate the methods used to assess the stock, taking into account the available data.
   a) Are the methods scientifically sound and robust?
   b) Are the assessment models configured properly and used in a manner consistent with standard practices?
   c) Are the methods appropriate for the available data?

3) Evaluate the assessment findings with respect to the following:
   a) Are abundance, exploitation, and biomass estimates reliable, consistent with input data and population biological characteristics, and useful to support status inferences?
   b) Is the stock overfished? What information helps you reach this conclusion?
   c) Is the stock undergoing overfishing? What information helps you reach this conclusion?
   d) Is there an informative stock-recruitment relationship? Is the stock-recruitment curve reliable and useful for evaluation of productivity and future stock conditions?
   e) Are the quantitative estimates of the status determination criteria for this stock appropriate for management use? If not, are there other indicators that may be used to inform managers about stock trends and conditions?

4) Consider how uncertainties in the assessment, and their potential consequences, are addressed.
   a) Comment on the degree to which the methods used to evaluate uncertainty reflect and capture the significant sources of uncertainty in the population, data sources, and assessment methods.
   b) Ensure that the implications of uncertainty for technical conclusions are clearly stated.

5) Consider the research recommendations provided by the Data and Assessment Workshops and make any additional recommendations or prioritizations warranted.
   a) Clearly denote research and monitoring that could improve the reliability of, and information provided by, future assessments.
   b) Provide recommendations on possible ways to improve the SEDAR process.

6) Provide guidance on key improvements in data or modeling approaches that should be considered when scheduling the next assessment.

7) Prepare a Peer Review Summary summarizing the Panel's evaluation of the stock assessment and addressing each Term of Reference. Develop a list of tasks to be completed following the workshop. Complete and submit the Peer Review Summary Report in accordance with the project guidelines.

The Panel shall ensure that corrected estimates are provided by addenda to the assessment report in the event that corrections are made in the assessment, alternative model configurations are recommended, or additional analyses are prepared as a result of Review Panel findings regarding the TORs above.