
MKT 410 Lecture 1: Market Research


Colorado State University
MKT 410
Gina S Mohr

Introduction

Marketing research and decision making
-Marketing research: links the consumer, customer, and public to the marketer through information used to:
  • Identify and define marketing opportunities and problems
  • Generate, refine, and evaluate marketing actions
  • Monitor marketing performance
  • Improve understanding of marketing as a process
  • Marketing research specifies the information required to address these issues, designs the method for collecting information, manages and implements the data collection process, analyzes the results, and communicates the findings and their implications

The role of marketing research
-Descriptive: gathering and presenting statements of fact
-Diagnostic (prescriptive): describes the effect that one variable has on another variable; optimization
-Predictive: using descriptive and diagnostic data to make predictions about unknown values of a variable of interest

Quality and satisfaction
-Return on quality: management objective based on the principles that:
  • The quality being delivered is at the level desired by the target market
  • The level of quality must have a positive impact on profitability
    o Satisfaction surveys and sampling
    o Customer lifetime value
    o Waking the sleeping dog: don't remind people they have to make a decision
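The notes list customer lifetime value without a formula, so here is a hedged, minimal sketch of one common simplified formulation: the discounted margin expected from a customer who is retained with a fixed probability each year. The margin, retention, and discount figures are made-up illustration values, not numbers from the course.

```python
# Illustrative only: a simplified customer lifetime value (CLV) calculation.
# margin_per_year, retention_rate, and discount_rate are hypothetical example
# values, not figures from the notes.

def simple_clv(margin_per_year: float, retention_rate: float,
               discount_rate: float, years: int = 10) -> float:
    """Discounted margin from a customer retained with a fixed probability
    each year (a common textbook simplification)."""
    clv = 0.0
    for t in range(1, years + 1):
        survival = retention_rate ** t          # chance the customer is still active in year t
        clv += margin_per_year * survival / (1 + discount_rate) ** t
    return clv

print(round(simple_clv(margin_per_year=120, retention_rate=0.8, discount_rate=0.10), 2))
```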
Trends in marketing intelligence
-Increased emphasis on secondary data collection methods
-Movement toward technology-based data management
-Expanded use of digital technology for information acquisition
-Movement toward an information management environment
-A broader international client base

Marketing planning process
-Situation analysis
  • Understand the environment and the market
  • Identify threats and opportunities
  • Assess the competitive position
-Strategy development
  • Define the business scope and served market segments
  • Establish competitive advantages
  • Set performance objectives
-Marketing program development
  • Product and channel decisions
  • Communication decisions
  • Pricing
  • Personal selling decisions
-Implementation
  • Performance monitoring
  • Refining strategies and programs

The Marketing Research Process and Research Design

The research process
-The development of a research purpose that links the research to decision making, and the formulation of research objectives that serve to guide the research, are unquestionably the most important steps in the research process

Marketing research process
-Define the research problem
  • What should management do (management decision problem) and what information is needed (marketing research problem)?
-Develop the approach to the problem
  • Formulating an analytical framework and models, research questions, and hypotheses (research objective)
-Formulate the research design
  • A detailed blueprint used to guide a research study toward its objectives
  • Research approach:
    o Exploratory (e.g., focus groups)
      ▪ To formulate a problem or define a problem more precisely
      ▪ To identify alternative courses of action
      ▪ To develop hypotheses
      ▪ To isolate key variables and relationships for further examination
      ▪ To gain insights for developing an approach to the problem
      ▪ To establish priorities for further research
    o Descriptive
      ▪ To develop a profile of a target market
      ▪ Who, what, when, where, sometimes why
      ▪ To estimate the frequency of product use as a basis for a sales forecast
      ▪ To determine the relationship between product use and perception of product characteristics
      ▪ To determine the degree to which marketing variables are associated
    o Causal
      ▪ To understand which variables are the causes (independent variables) and which are the effects (dependent variables) of a phenomenon
      ▪ To determine the extent of the relationship between the predicted effect and the causal variables
-Do fieldwork or collect data
-Prepare and analyze the data
-Prepare and present the report

Research design
-DRAW

Defining the marketing research problem

Marketing research process
-Identification of the problem and statement of the research objectives
-Creation of the research design
-Choice of method of research
-Selection of the sampling procedure
-Collection of the data
-Analysis of the data
-Writing and presentation of the report
-Follow-up

Translate the management problem into a marketing research problem
-Management decision problem: a statement specifying the type of managerial action required to solve the problem
-Marketing research problem: a statement specifying the type of information needed by the decision maker to help solve the management decision problem, and how that information can be obtained efficiently and effectively
-Marketing research objective: a goal statement defining the specific information needed to solve the marketing research problem

The marketing research problem (a template / conceptual map)
-Management wants to (take action)
-Therefore, we should study (topic)
-So that we can explain (question)
-Features to keep in mind:
  • Symptoms vs. causes
  • Actions vs. information

Problem definition
-Involves stating the general problem (problem identification) and identifying the specific components of the marketing research problem
-The marketing research problem is information oriented and focuses on underlying causes; the management problem is action oriented and focuses on symptoms
-Discussions with decision makers
-Interviews with industry experts
-Secondary data analysis
-Qualitative and exploratory research
-Past information and forecasts
-Resource constraints and objectives
-Buyer behavior
-Legal, economic, marketing, technological environments

Developing a research objective
-The research objective includes:
  • Research questions (specific and relevant to the purpose)
  • Hypotheses
  • Scope (bounds describing geography, segment, etc.)
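To make the template and research-objective elements above concrete, here is a minimal sketch that fills them in for a hypothetical retailer; the scenario, hypotheses, and scope are invented for illustration only, not taken from the lecture.

```python
# Hypothetical example of translating a management decision problem into a
# marketing research problem and a research objective (invented scenario).
research_objective = {
    "management_decision_problem": "Should we extend store hours to 10 pm?",  # action oriented
    "marketing_research_problem": ("Determine current shopping-time preferences and "
                                   "likely evening patronage among existing customers"),  # information oriented
    "research_questions": [
        "What share of customers would shop after 8 pm at least once a month?",
    ],
    "hypotheses": [
        "H1: Customers under 30 are more likely to shop after 8 pm than older customers.",
    ],
    "scope": "Fort Collins locations, current loyalty-card holders, next quarter",
}

for key, value in research_objective.items():
    print(f"{key}: {value}")
```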
Secondary and standardized sources of marketing data

Secondary data
-Data that were collected by persons or agencies for purposes other than solving the specific problem at hand
-Uses of secondary data:
  • May solve the problem
  • Idea generation
  • Define the problem/formulate hypotheses
  • Adopt measures/methods used by other researchers
  • Population definition
  • Benchmark/validation

Nature of secondary data
-Benefits:
  • Low cost
  • Less effort and time
  • Sometimes more accurate
  • Sometimes the only option
  • May be sufficient on its own
-Limitations:
  • Collected for some other purpose
  • No control over data collection
  • Accuracy, format, may be outdated
  • May not meet data requirements
  • Built-in assumptions

Internal sources of secondary data
-Sales/patronage results (outcomes)
-Marketing activity (inputs)
-Cost information
-Distributor reports and feedback
-Customer feedback
-Accounting data
-Sales reports
-Inventory management

Customer databases
-Data mining: the use of statistical and other advanced software to discover nonobvious patterns hidden in a database (a minimal scoring sketch appears at the end of this section)
  • Purchase propensity
  • Bottom-up strategy
-Behavioral targeting: the use of online and offline data to understand a consumer's habits, demographics, and social networks in order to increase the effectiveness of online advertising
  • Predictive analytics approach

North American Industry Classification System (NAICS)
-Provides a consistent framework for the collection, analysis, and dissemination of industrial statistics
-Industries are identified by a 6-digit code; there are 20 broad sectors
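As referenced in the data-mining bullet above, here is a minimal sketch of scoring purchase propensity from an internal customer database. The field names and data are hypothetical, and scikit-learn's logistic regression is just one common choice; nothing here comes from the notes beyond the idea of a propensity score.

```python
# Minimal sketch (hypothetical data): scoring purchase propensity from an
# internal customer database, in the spirit of the data-mining bullet above.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: recency (weeks since last purchase), frequency (orders last year),
# opened_last_email (0/1). Target: bought during the last promotion (0/1).
X = np.array([
    [1, 12, 1], [2, 8, 1], [10, 2, 0], [26, 1, 0],
    [3, 9, 1], [18, 3, 0], [5, 6, 1], [30, 1, 0],
])
y = np.array([1, 1, 0, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X, y)

# Score new customers: higher propensity -> better target for the next campaign.
new_customers = np.array([[2, 10, 1], [20, 2, 0]])
for row, p in zip(new_customers, model.predict_proba(new_customers)[:, 1]):
    print(f"recency={int(row[0]):>2}, frequency={int(row[1]):>2}, "
          f"opened={int(row[2])} -> propensity={p:.2f}")
```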
Qualitative research methods

Research approach
-Exploratory research: discover ideas and insights to better understand the problem at hand
-Descriptive research: collect information that provides answers to research questions (who, what, when, where, how)
-Causal research: test cause-and-effect relationships between marketing variables

Research methods
-Quantitative research: places heavy emphasis on using formal, standard questions and predetermined response options in questionnaires or surveys administered to large numbers of respondents
-Qualitative research: relies on the collection of data in the form of text or images using open-ended questions, observation, or "found" data

Goals of quantitative research
-Make accurate predictions about relationships between market factors and behaviors
-Quantify group differences
-Validate relationships
-Test hypotheses

Goals of qualitative research
-Gain preliminary insight into research problems
-Probe more deeply into areas that quantitative research is too superficial to access
-Provide initial ideas about specific problems, theories, relationships, variables, scale design

Methods of exploratory research
-Focus group
  • Formal process of bringing a small group of people (8-12) together for an interactive, spontaneous discussion of one particular concept or topic
  • Led by a moderator and generally lasts around 90 minutes
  • Good for idea generation, brainstorming, and understanding customer vocabulary
  • Can be helpful in gaining insight into motives, attitudes, perceptions
  • Can reveal needs, likes and dislikes, and prejudices driven by emotions
  • Group dynamics: the moderator must manage this factor deftly
  • Designing a focus group
    o Critical steps
      ▪ Problem identification/research question
      ▪ Sampling frame
      ▪ Choose a moderator
      ▪ Generate an interview guide
      ▪ Recruit the sample
      ▪ Conduct the focus group
      ▪ Analyze the data
      ▪ Write the report
      ▪ Decision making
    o 3 components
      ▪ The participants: selection process
        · Potential opinion leaders are best
        · Participants must be screened for relevance to the topic
      ▪ The focus group facility
        · A research facility consisting of a conference room or living room setting and a separate observation room with a one-way mirror or live audiovisual feed
      ▪ The moderator
        · A person hired by the client to lead the focus group (should have a background in psychology/sociology/marketing)
    o Moderator guide
      ▪ Introduction
      ▪ Simple, fun activity
      ▪ Research topic questions
        · Broad to specific
        · Behaviors, then attitudes
        · Positive before negative
        · Unaided before aided
        · Respondent categories before your categories
      ▪ Projective technique/intervention
      ▪ Wrap-up and closing
-In-depth interview
  • Individual in-depth interview (IDI): a formal interview process in which a well-trained interviewer asks a respondent a set of semi-structured questions in a face-to-face setting
  • Objectives:
    o Discover what the respondent thinks about a topic
    o Understand the feelings, beliefs, and opinions of a subject and why they exist
    o Encourage the subject to communicate as much detail as possible
  • Advantages:
    o Group pressure eliminated
    o Respondent feels important and truly wanted
    o Respondent attains a heightened state of awareness
    o Encourages the revelation of new information
    o Respondents can be questioned at length to reveal feelings and motivations
    o Individual interviews allow greater flexibility in the direction of questioning
    o Interviewer becomes more sensitive to nonverbal feedback
    o A singular viewpoint can be obtained without influence from others
    o Interviews can be conducted anywhere
  • How is an IDI different from structured interviews (intercepts)?
    o Emphasis on the generality of the topic (not answers to specific questions)
    o Greater emphasis on the respondent's point of view
    o "Rambling" is encouraged
    o Interviewer can depart from the interview guide and ask follow-up questions
    o More flexible and often led by the respondent but guided by the interviewer
    o Rich answers are desired, as opposed to easily coded predetermined response options
  • IDI: nondirective
    o Nondirective (unstructured): the respondent is given maximum freedom to respond (within the bounds of topics of interest to the interviewer)
      ▪ Establishing rapport
      ▪ Ability to probe in order to elaborate on interesting responses
      ▪ Guiding discussion back to the topic
      ▪ Always pursuing the reasons behind comments and answers
  • Semi-structured: the interviewer attempts to cover a specific list of topics
    o Effective with busy executives, technical experts, thought leaders
    o Gaining access
    o Establish rapport and credibility
    o Record keeping can be tricky
    o Snowballing
  • Steps in conducting IDIs
    o Understand the question/problem
    o Create the interview guide
    o Decide on the best environment for the interview
    o Recruit and screen respondents
    o Introduce the respondent to the interview process
    o Conduct the interviews
    o Analyze respondent responses
    o Write the report
  • Preparing the interview guide
    o Create some order given the topic areas
    o Use language that is comprehensible to your respondents
    o Remember to collect basic demographics
-Observation
  • Systematic witnessing and recording of the behavioral patterns of objects, people, and events without directly communicating with them (mystery shopping is an exception)
  • Interviewing vs. observation
    o DRAW
  • Conditions for using observation
    o The needed information must be either observable or inferable
    o The behavior should be repetitive, frequent, or in some manner predictable
      ▪ Some defined behavior that you can record
    o The behavior must be relatively short in duration
  • The nature of observation
    o Natural vs. contrived
      ▪ Contrived: behavioral projective tests; people are placed in a contrived situation, and their attitudes, beliefs, and motives can be observed
    o Open vs. disguised (overt vs. covert)
      ▪ Open: the participant knows they're being observed
      ▪ Disguised: the process of monitoring people who don't know they're being watched
    o Human vs. machine
      ▪ "Movie cameras, audiovisual equipment, and software record behavior much more objectively and in greater detail than human observers ever could"
    o Structured vs. unstructured
      ▪ Casual/unstructured: occurs continuously
      ▪ Structured: involves the use of observation schedules/instruments
    o Direct vs. indirect
      ▪ Content (direct) analysis: an observation technique used to analyze written material into meaningful units, using carefully applied rules
      ▪ Physical trace measures (indirect): empty bottles, radio dials, glue spots in magazines, rate of wear on floor tiles in a museum
  • Mystery shopping
    o The purpose of a mystery shopper is to gather observational data about a store and to collect data about customer-employee interactions
    o Classified as an observational marketing research method, even though communication is often involved
  • Why conduct mystery shopping?
    o Enabling an organization to monitor compliance with product/service delivery standards and specifications
    o Enabling marketers to examine the gap between promises made through advertising/sales promotion and actual service delivery
    o Helping monitor the impact of training and performance-improvement initiatives on compliance with or conformance to product/service delivery specifications
    o Identifying differences in the customer experience across times of day, locations, product/service types, and other potential sources of variation in product/service quality
  • Advantages (of observation)
    o You see what people actually do rather than what they say they do
    o Firsthand information is less prone to biases
    o Observational data collection can be executed quickly and relatively accurately
    o Electronic collection such as scanner data is more efficient than manual counts
    o Clients can observe their customers along with the researcher
  • Limitations
    o Only physical actions or behavior can be measured
    o Can't measure attitudes, beliefs, intentions, feelings
    o Not always a good representation of the general population
    o Interpretation is somewhat subjective, depending on the observation type
    o Data analysis is generally more qualitative than quantitative
    o Can be expensive and time consuming if subjects are not readily available
    o Data can be time sensitive, making predictive analysis tricky
-Ethnography
  • A form of qualitative data collection that records behavior in natural settings to understand how social and cultural influences affect individual behaviors and experiences
  • Participant observers: ethnographers can use their intimacy with the people they're studying to gain richer, deeper insights into culture and behavior (what makes people do what they do)
  • Informal and ongoing in-depth interviewing
  • Ethnographers pay close attention to the words, metaphors, symbols, and stories people use to explain their lives and communicate with one another
  • What counts as "cultural analysis"?
    o The process of uncovering the meaning that consumers have attached to products
    o This may be revealed through the words, metaphors, symbols, and stories people use to explain their lives and communicate with one another
  • How are ethnographic interviews different from personal in-depth interviews?
    o Perspective (learn first; suspend our own judgment)
    o Kinds of questions they ask (stories)
    o Kinds of details they pay attention to (how things are said, what's implied, what's not said): everything counts as data
  • Advantages
    o Test new products in a real context
    o Shows how, when, where, and why people shop for brands
    o Reveals product problems
    o Can form better relationships with your consumers based on an intimate knowledge of their lifestyles

Qualitative data analysis
(Exam note: listing questions, fill-in-the-blank examples, and 5 multiple-choice questions from homework assignments (not company specific; only information also presented in class); know what makes each method unique)

Features of qualitative data analysis
-Form: text, pictures, images
  • Text is an interpretation that can never be judged true or false
    o Meaning is negotiated among a community of researchers
-Goals: get "behind the numbers"
-Inductive: important categories are identified, as well as patterns and relationships
-Interrelated: the whole is always understood to be greater than the sum of its parts
-Iterative and reflexive: begins as data are collected and the focus is refined

Qualitative vs. quantitative data analysis
-Focus on meanings rather than quantifiable phenomena
-Collection of much data on a few cases rather than a few data on many cases
-Study in depth and detail, without predetermined categories or directions, rather than emphasis on analyses and categories determined in advance (sometimes)
-Sensitivity to context rather than seeking generalizability
-Goal of rich descriptions of the world rather than measurement of specific variables

Qualitative data analysis steps (not on test)
-Step 1: data reduction (a small coding-tally sketch follows at the end of this section)
  • Data reduction: the categorization and coding of data that is part of the theory-development process in qualitative data analysis
  • Categorization: placing portions of transcripts into similar groups based on content
  • Code sheet: a sheet of paper that lists the different themes or categories for a particular study
  • Codes: labels or numbers that are used to track categories in a qualitative study
  • Comparison: the process of developing and refining theory and constructs by analyzing the differences and similarities in passages, themes, and types of participants
  • Iteration: working through the data several times in order to modify early ideas and to be informed by subsequent analyses
  • Memoing: writing down thoughts as soon as possible after each interview, focus group, or site visit
  • Negative case analysis: deliberately looking for cases and instances that contradict the ideas and theories the researchers have been developing
-Step 2: data display
  • Table of central themes
  • Diagrams
  • Table of comparison of categories
  • Matrix including quotes for themes
  • Consensus map
-Step 3: conclusion drawing/verification
  • Credibility: the degree of rigor, believability, and trustworthiness established by qualitative research

Strategy
-Frequency: what is said most frequently isn't always most important
-Specificity: concrete examples are given more emphasis
-Emotion: give more weight to themes accompanied by emotion, enthusiasm, passion, intensity
-Extensiveness: how many different people said the same thing
-Beware of personal bias or preexisting opinions about the topic
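As referenced in the data-reduction step above, here is a minimal coding-tally sketch. The codes, participants, and passages are invented; it simply tallies how often each theme was coded (frequency) and by how many different participants (extensiveness), echoing the strategy notes.

```python
# Minimal sketch of qualitative data reduction/display with hypothetical codes:
# tally how often each theme was coded, and by how many different participants.
from collections import Counter, defaultdict

# (participant, code) pairs produced while reading transcripts; invented data.
coded_passages = [
    ("P1", "price"), ("P1", "convenience"), ("P2", "price"),
    ("P2", "price"), ("P3", "trust"), ("P3", "convenience"), ("P4", "price"),
]

frequency = Counter(code for _, code in coded_passages)
participants_per_code = defaultdict(set)
for participant, code in coded_passages:
    participants_per_code[code].add(participant)

for code in frequency:
    print(f"{code:<12} mentions={frequency[code]}  "
          f"extensiveness={len(participants_per_code[code])}")
```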
Descriptive research

Descriptive research designs
-Descriptive research designs are used when:
  • The research purpose is to describe characteristics of existing market situations or to evaluate current marketing mix strategies
  • The research question includes issues (who, what, when, where, how) for target populations or marketing strategies
  • The task is to identify relationships between variables (correlation) or determine group differences
-Ask questions: survey research
-Observe behaviors: scanner data
-Observational techniques: exploratory and descriptive, qualitative and quantitative
-Marketing analytics: observational data in descriptive research designs (quantitative)
-Marketing research: observational data in exploratory research designs (qualitative)
-Descriptive research designs: marketing research will focus on survey research

Information from surveys
-Attitudes (cognitive, affective, intentions)
  • Awareness
  • Knowledge
  • Perceptions
  • Features
  • Availability
  • Pricing
  • Overall assessments (favorability)
-Decisions
  • Not past outcomes but processes
  • Needs, desires, preferences, motives, goals
  • What, where, when, and how often of behaviors
-Individual factors
  • Lifestyle
  • Demographics

Survey research isn't that easy
-Correctly define your population and sample from that population (representativeness)
-Respondents must be willing and able to cooperate, understand the questions, and have the knowledge/opinions/attitudes/facts required
-The interviewer must understand and record the responses correctly
-Collectively, when these conditions fail, we call it "error"

Types of error in survey research (a small numerical sketch follows this list)
-Random error or random sampling error: error that results from chance variation (the difference between the sample value and the true value of the population mean)
-Systematic error or bias: error that results from problems or flaws in the execution of the research design ("non-sampling error")
  • Errors we can control
  • Sample design error: systematic error that results from an error in the sample design or sampling procedures
  • Measurement error: systematic error that results from a variation between the information being sought and what is actually obtained by the measurement process
  • Input error: error that results from the incorrect transfer of information from a survey document to a computer
  • Nonresponse bias: error that results from a systematic difference between those who do and do not respond to the measurement instrument
  • Refusal rate: the percentage of persons contacted who refuse to participate in a survey
    o The complement of the cooperation rate
  • Response bias: error that results from the tendency of people to answer a question incorrectly, through either deliberate falsification or unconscious misrepresentation
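To tie the error definitions above to numbers, here is a small simulation sketch with invented figures: random sampling error appears as chance spread of sample means around the true mean, nonresponse bias shifts the estimate systematically, and the refusal rate is refusals divided by persons contacted.

```python
# Illustrative simulation (invented population and rates) contrasting random
# sampling error with nonresponse bias, plus a refusal-rate calculation.
import random

random.seed(1)
population = [random.gauss(50, 10) for _ in range(100_000)]   # true mean is about 50
true_mean = sum(population) / len(population)

# Random sampling error: sample means scatter around the true mean by chance.
sample_means = [sum(random.sample(population, 200)) / 200 for _ in range(5)]
print("true mean:", round(true_mean, 2),
      "sample means:", [round(m, 2) for m in sample_means])

# Nonresponse bias: suppose people with high values rarely respond.
respondents = [x for x in random.sample(population, 2000)
               if x < 55 or random.random() < 0.3]
print("biased respondent mean:", round(sum(respondents) / len(respondents), 2))

# Refusal rate: percentage of persons contacted who refuse to participate.
contacted, refused = 500, 320
refusal_rate = refused / contacted
print(f"refusal rate = {refusal_rate:.0%}, cooperation rate = {1 - refusal_rate:.0%}")
```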
Survey research methods
-Person-administered/computer-assisted
  • Home, office, intercept, telephone survey, CATI
-Self-administered
  • Email, mail, mobile
  • No interviewer, only a researcher
  • Disadvantages:
    o Respondents often take a long time; the researcher might overcomplicate the survey since respondents have more time
    o No one is present to explain things to the respondent or clarify responses to open-ended questions
    o Low response rates; mailing lists are often out of date; not sure who completed the survey; manual data entry is common
    o Can't verify the responder (security issues); not always representative of the population

Measurement
(Exam note: a survey will be used on the exam; know what level of scale certain things are and be able to critique items)

Measurement
-The process of developing methods to systematically characterize or quantify information about persons, events, ideas, and objects of interest
-"…The assignment of numerals to objects or events according to rules"
-Rule: the guide, method, or command that tells a researcher what to do

Measurement process
-We are measuring a concept/construct (selection/development)
-DRAW?
-We use scale measurement to precisely measure it
-We can be confident in our measurement with reliability and validity

Measurement process steps
-Identify the concept of interest
  • Concept: abstract ideas generalized from particular facts
    o A category of thought used to group sense data together "as if they were all the same"
  • "When a concept is created or used for special scientific purposes, concepts are called constructs"
-Develop a construct
  • Construct: a hypothetical variable made up of a set of component responses or behaviors that are thought to be related
    o Specific types of concepts that exist at higher levels of abstraction
    o Ex: brand loyalty, high-involvement purchasing, social class, personality, channel power
    o Can't be directly observed, only estimated
-Define the construct constitutively
  • Constitutive definition: a statement of the meaning of the central idea or concept under study, establishing its boundaries (theoretical or conceptual definition)
-Define the construct operationally
  • Operational definition: a statement of precisely which observable characteristics will be measured and the process for assigning a value to the concept
-Develop a measurement scale
  • Measurement requirements:
    o One-to-one correspondence between the symbol and the characteristic
    o Rules for assignment must be invariant over time and over the objects being measured
-Evaluate the reliability and validity of the measurement
  • DRAW
  • Scale reliability
    o The degree to which measures are free from random error and provide consistent data
      ▪ The extent to which the survey responses are internally consistent
      ▪ A reliable measurement doesn't change when the concept being measured remains constant in value
    o Differences in measured scores can come from:
      ▪ True differences in the characteristic being measured
      ▪ Differences due to stable characteristics of individual respondents, short-term personal factors, the sampling of items included in the questionnaire, lack of clarity in the measurement instrument, or mechanical/instrument factors
      ▪ Differences caused by situational factors
      ▪ Differences resulting from variations in administering the survey
  • Testing scale reliability
    o Test and retest: the ability of the same instrument to produce consistent results when used a second time under conditions as similar as possible to the original conditions
    o Stability: lack of change in results from test to test
    o Equivalent form: the ability of two very similar forms of an instrument to produce closely correlated results
    o Internal consistency: the ability of an instrument to produce similar results when used on different samples during the same time period to measure a phenomenon
    o Split-half technique: a method of assessing the reliability of a scale by dividing the total set of measurement items in half and correlating the results (a brief numerical sketch follows this block)
  • Scale validity
    o Reliability is a necessary but not sufficient condition for scale validity
      ▪ A reliable measure may be one that's consistently wrong
    o Validity: the scale measures what it is supposed to measure
  • Testing scale validity
    o Face validity: the degree to which a measurement seems to measure what it's supposed to measure
    o Content validity: the representativeness (sampling adequacy) of the content of a measurement instrument
    o Criterion-related validity: the degree to which a measurement instrument can predict a variable that is designated a criterion
    o Construct validity: the degree to which a measurement instrument represents and logically connects (via underlying theory) the observed phenomenon to the construct
    o Predictive validity: the degree to which a future level of a criterion can be forecast by a current measurement scale
    o Concurrent validity: the degree to which another variable (measured at the same point in time as the variable of interest) can be predicted by the measurement instrument
    o Convergent validity: the degree of correlation among different measures that purport to measure the same construct
    o Discriminant validity: a measure of the lack of association among constructs that are supposed to be different
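As referenced under testing scale reliability above, here is a minimal sketch of the split-half technique on invented responses: the item set is split in half, the two half-scores are correlated, and (an addition not in the notes) the Spearman-Brown formula is commonly used to adjust that correlation to full test length.

```python
# Minimal sketch of split-half reliability on hypothetical survey data:
# 6 respondents x 4 Likert items intended to measure the same construct.
import numpy as np

responses = np.array([
    [5, 4, 5, 4],
    [2, 2, 1, 2],
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [1, 2, 2, 1],
])

# Split the item set in half (odd vs. even items) and score each half per respondent.
half_a = responses[:, ::2].sum(axis=1)
half_b = responses[:, 1::2].sum(axis=1)

r = np.corrcoef(half_a, half_b)[0, 1]   # correlation between the two halves
spearman_brown = 2 * r / (1 + r)        # adjust to full-length reliability
print(f"split-half r = {r:.2f}, Spearman-Brown reliability = {spearman_brown:.2f}")
```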
Scale levels
-Nominal scale
  • Objects are assigned to mutually exclusive, labeled categories, but there are no necessary relationships among the categories
    o No ordering; categories don't reflect relative amounts of what is being measured
  • Nothing to compare
  • No quantity
  • Statistics: mode, frequencies, percentages
  • Ex: yes/no questions, gender, race/ethnicity
-Ordinal scale
  • Objects are ranked or arranged in order with regard to some common variable
    o Not possible to determine how much more or less from this type of scale
  • Statistics: median, percentiles, frequencies, percentages
    o Ex: a scale of ranges (0, 2-3, 4-5, 6+); you don't know where within the chosen range the respondent falls
  • Ex: best/worst liked, income level, comparison rankings
  • Indicates the quantity of something (but only as an ordering)
-Interval scale
  • Ratios of the differences between scale points can be determined
    o Allows for standard comparison of measurements between objects
    o In other words, the ranking also represents equal increments of the attribute being measured
    o However, ratios of the absolute scale values can't be determined
  • Assumes the difference between scale points is a standard amount
  • Statistics: median, mean, measures of variability (variance and standard deviation)
  • Ex: ratings on a "1-7" scale
-Ratio scale
  • Comparisons between responses can be made and there is a "true natural zero"
    o Ratio-scaled data can be transformed into interval-, ordinal-, or nominal-scaled data
  • Statistics: ratios, measures of central tendency (median and mean), and measures of variability (variance and standard deviation)
  • Ex: age, number of children, dollars in income
  • Actual/exact numbers
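To connect each scale level above to the statistics it permits, here is a brief sketch with made-up responses: mode and frequencies for a nominal item, median for an ordinal item, and mean with standard deviation (plus ratios) for interval- and ratio-scaled items.

```python
# Hypothetical responses illustrating which summary statistics fit each scale level.
import statistics

gender = ["F", "M", "F", "F", "M"]           # nominal: mode, frequencies
income_bracket = [1, 3, 2, 2, 4]             # ordinal (coded ranges): median, percentiles
satisfaction_1to7 = [6, 5, 7, 4, 6]          # interval rating: mean, variance
age_years = [19, 22, 21, 20, 23]             # ratio: means, ratios, true zero

print("nominal  -> mode:", statistics.mode(gender))
print("ordinal  -> median bracket:", statistics.median(income_bracket))
print("interval -> mean:", statistics.mean(satisfaction_1to7),
      "sd:", round(statistics.stdev(satisfaction_1to7), 2))
print("ratio    -> mean age:", statistics.mean(age_years),
      "oldest/youngest:", round(max(age_years) / min(age_years), 2))
```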
Attitude measurement
-Attitudes: mental states used by individuals to structure the way they perceive their environment and guide the way they respond to it
-Cognitive: a person's information about an object (awareness, beliefs, judgments)
-Affective: summarizes a person's overall feelings toward an object or situation
-Intention/action: a person's expectations of future behavior toward an object

When do attitudes predict behavior?
-Involvement of the consumer (high involvement)
-Attitude measurement (level of abstraction and time frame)
-Effects of other people (social compliance)
  • Some sort of social pressure
-Situational factors (time pressure, sickness, holidays)
-Effects of other brands (favorability may be higher for other brands)
  • Ask about competitors
-Attitude strength (strong attitudes)

Scaling approaches and considerations
-Unidimensional: measures only one attribute of a concept, respondent, or object
  • Can be a composite measure of multiple items
-Multidimensional: measures several dimensions of a concept, respondent, or object
-Extent of category description: all categories/polar categories
-Treatment of respondent uncertainty: neutral or "don't know" response
-Balance of favorable and unfavorable categories: equal/unbalanced
-Comparison judgment required: yes/no

Itemized rating scale
-Unidimensional
  • All categories are labeled, no neutral or "I don't know" option, balanced, no comparison
-Multidimensional
  • All categories, "don't know" option, unbalanced, comparison (average, below average)
-Associative scaling
  • Phone interviews

Rank-order scales
-Cautions:
  • Require more mental effort and force respondents to make choices they may otherwise not make
  • Limit to 5-6 items
-Ordinal
  • You don't know to what extent one thing is ranked higher than another

Paired-comparison scales

Constant sum scale
-Difficult for respondents
-Beneficial for researchers who want to understand relative amounts of importance

Semantic-differential scales
-Brand personalities
-Polar opposites
-Exploratory research identifies meaningful dimensions
-Negative and positive poles should be rotated to avoid halo effects
  • So people don't stay on one side of the scale
  • Increases attention rates
-Generally treated as an interval scale

Likert scales
-Interval scale
-Statements can come from exploratory research
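Following the semantic-differential and Likert notes above (rotated poles, balanced item direction, treated as interval data), here is a short sketch with invented items: negatively worded items are reverse-scored before averaging into a composite attitude score.

```python
# Hypothetical 5-point Likert battery: reverse-score negatively worded items,
# then average into a composite attitude score (treated as interval data).
items = {
    "The brand is good value.": {"reverse": False, "response": 4},
    "I would not recommend this brand.": {"reverse": True, "response": 2},
    "The brand fits my lifestyle.": {"reverse": False, "response": 5},
}

SCALE_MAX = 5
scored = [
    (SCALE_MAX + 1 - item["response"]) if item["reverse"] else item["response"]
    for item in items.values()
]
composite = sum(scored) / len(scored)
print("item scores:", scored, "composite:", round(composite, 2))
```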
Multiple item scales
-General guidelines
  • Determine clearly what it is that you want to measure
  • Generate as many items as possible
  • Ask experts to evaluate the initial pool of items
  • Determine the type of attitudinal scale to be used
  • Include validation items in the scale
  • Administer the items to an initial sample (pretest)
  • Evaluate and refine the items of the scale
  • Optimize the scale length

Q-sort scaling
-Dragging pictures into boxes

Graphic rating scale
-Useful where there could be language/age barriers

Survey design

Steps in questionnaire design
-Confirm research objectives
-Format the questionnaire
-Develop questions/scaling and wording
-Determine the layout and evaluate the questionnaire
-Pretest, revise, and finalize the questionnaire
-Be mindful of the research objective, the respondent, and the data format

Considerations
-Begin with simple questions and then progress to more difficult ones
-Ask personal questions at the end
-Place sensitive questions toward the end
-Avoid asking questions using a different measurement format in the same section of the questionnaire
-End with a thank-you statement

Develop questions and scaling: open-ended questions
-For the introduction to a survey or a topic
-When it's important to measure the salience of an issue to a respondent
-When there are too many responses to be listed, or they can't be foreseen
-When verbatim responses are desired to give the flavor of people's answers or to cite examples
-When the behavior to be measured is sensitive/disapproved of
-Disadvantages
  • Variability in the clarity and depth of the responses depends on:
    o The articulateness of the respondent in a personal interview
    o Willingness to compose a written answer for a mail survey
    o The interviewer's ability to record verbatim answers quickly
  • Interviewer bias while recording responses
  • Time consuming
  • Involves subjective judgments/prone to error
  • Expensive
  • Answers expand or contract depending on the space provided