Research methodology: a step by step guide for beginners / Ranjit Kumar; editor, Aly Owen

By: Kumar, Ranjit
Contributor(s): Owen, Aly (ed.)
Material type: Text
Publication details: New Delhi: Pearson, 2005
Description: 332 p.: ill.; 23 cm
ISBN: 9788131704967
Subject(s): Research methods | Social Sciences | Investigation
DDC classification: 001.42
Holdings
Item type: General Books
Current library: Central Library, Sikkim University (General Book Section)
Call number: 001.42 KUM/R
Status: Available
Barcode: P39505
Total holds: 0

Contents:
1 Research: a way of thinking
Research: a way of examining your practice
Applications of research
Definitions of research
Characteristics of research
Types of research
Application
Objectives
Inquiry mode
Paradigms of research
Summary
2 The research process: a quick glance
The research process: an eight-step model
Steps in planning a research study
Step I: formulating a research problem
Step II: conceptualising a research design
Step III: constructing an instrument for data collection
Step IV: selecting a sample
Step V: writing a research proposal
Steps in conducting a study
Step VI: collecting data
Step VII: processing data
Step VIII: writing a research report
Summary
Step I Formulating a research problem
3 Reviewing the literature
Place of literature review in research
Bring clarity and focus to your research problem
Improve your methodology
Broaden your knowledge base in your research area
Contextualise your findings
Procedure for reviewing the literature
Search for existing literature
Review the literature selected
Develop a theoretical framework
Develop a conceptual framework
Writing up the literature reviewed
Summary
4 Formulating a research problem
The research problem
The importance of formulating a research problem
Sources of research problems
Considerations in selecting a research problem
Steps in the formulation of a research problem
The formulation of objectives
Establishing operational definitions
Summary
5 Identifying variables
The definition of a variable
The difference between a concept and a variable
Concepts, indicators and variables
Types of variable
From the viewpoint of causation
From the viewpoint of the study design
From the viewpoint of the unit of measurement
Types of measurement scale
The nominal or classificatory scale
The ordinal or ranking scale
The interval scale
The ratio scale
Summary
6 Constructing hypotheses
The definition of a hypothesis
The functions of a hypothesis
The characteristics of a hypothesis
Types of hypothesis
Errors in testing a hypothesis
Summary
Step II Conceptualising a research design
7 The research design
The definition of a research design
The functions of a research design
Summary
8 Selecting a study design
Study designs based on the number of contacts
The cross-sectional study design
The before-and-after study design
The longitudinal study design
Study designs based on the reference period
The retrospective study design
The prospective study design
The retrospective-prospective study design
Study designs based on the nature of the investigation
The experimental study designs
Others—some commonly used study designs
Action research
Feminist research
The cross-over comparative experimental design
The replicated cross-sectional design
Trend studies
Cohort studies
Panel studies
Blind studies
Double-blind studies
Case studies
Summary
Step III Constructing an instrument for data collection
9 Selecting a method of data collection
Methods of data collection
Collecting data using primary sources
Observation
Types of observation
Problems with using observation as a method of data collection
Situations in which observation can be made
The recording of observation
The interview
Unstructured interviews
Structured interviews
The questionnaire
Choosing between an interview schedule and a questionnaire
Different ways of administering a questionnaire
The contents of the covering letter
Advantages of a questionnaire
Disadvantages of a questionnaire
Advantages of the interview
Disadvantages of the interview
Forms of question
Advantages and disadvantages of open-ended questions
Advantages and disadvantages of closed-ended questions
Considerations in formulating questions
The construction of a research instrument
Asking personal and sensitive questions
The order of questions
Prerequisites for data collection
Collecting data using secondary sources
Problems with using data from secondary sources
Summary
10 Collecting data using attitudinal scales
Functions of attitudinal scales
Difficulties in developing an attitudinal scale
Types of attitudinal scale
The summated rating or Likert scale
The equal-appearing interval or Thurstone scale
The cumulative or Guttman scale
The relationship between attitudinal and measurement scales
Summary
11 Establishing the validity and reliability of a research instrument
The concept of validity
Types of validity
Face and content validity
Concurrent and predictive validity
Construct validity
The concept of reliability
Factors affecting the reliability of a research instrument
Methods of determining the reliability of an instrument
External consistency procedures
Internal consistency procedures
Summary
Step IV Selecting a sample
12 Sampling
The concept of sampling
The concept of sampling in qualitative research
Sampling terminology
Principles of sampling
Factors affecting the inferences drawn from a sample
Aims in selecting a sample
Types of sampling
Random/probability sampling designs
Non-random/non-probability sampling designs
'Mixed' sampling designs
The calculation of sample size
Summary
Step V Writing a research proposal
13 Writing a research proposal
The research proposal
Contents of a research proposal
Preamble/introduction
The problem
Objectives of the study
Hypotheses to be tested
Study design
The setting
Measurement procedures
Ethical issues
Sampling
Analysis of data
Structure of the report
Problems and limitations
Appendix
Work schedule
Summary
Step VI Collecting data
14 Considering ethical issues in data collection
Ethics
Stakeholders in research
Ethical issues concerning research participants
Collecting information
Seeking consent
Providing incentives
Seeking sensitive information
The possibility of causing harm to participants
Maintaining confidentiality
Ethical issues relating to the researcher
Avoiding bias
Provision or deprivation of a treatment
Using inappropriate research methodology
Incorrect reporting
Inappropriate use of information
Ethical issues regarding the sponsoring organisation
Restrictions imposed by the sponsoring organisation
The misuse of information
Summary
Step VII Processing data
15 Processing data
Editing data collected through structured inquiries (quantitative studies)
Editing data collected through unstructured interviewing
Coding data: introduction
Coding quantitative/categorical (qualitative and quantitative) data
Developing a code book
Pre-testing a code book
Coding the data
Verifying the coded data
Coding descriptive/quantitative data
Developing a frame of analysis for quantitative studies
Frequency distributions
Cross-tabulations
Constructing the main concepts
Statistical procedures
Developing a frame of analysis for qualitative studies
Analysing data
The role of computers in research
The role of statistics in research
Summary
16 Displaying data
Tables
Structure
Types of tables
Types of percentages
Graphs
The histogram
The bar chart
The stacked bar chart
The 100 per cent bar chart
The frequency polygon
The cumulative frequency polygon
The stem-and-leaf display
The pie chart
The line diagram or trend curve
The area chart
The scattergram
Summary
Step VIII Writing a research report
17 Writing a research report
Research writing in general
Referencing
Writing a bibliography
Developing an outline
Writing about a variable
Summary
18 Research methodology and practice evaluation
What is evaluation?
Why evaluation?
Intervention-development-evaluation process
Perspectives in the classification of evaluation studies
Types of evaluation from a focus perspective
Evaluation for planning a program/intervention
Process/monitoring evaluation
Impact/outcome evaluation
Cost-benefit/cost-effectiveness evaluation
Types of evaluation from a philosophical perspective
Goal-centered/objective-oriented evaluation
Consumer-oriented/client-centred evaluation
Improvement-oriented evaluation
Holistic/illuminative evaluation
Understanding an evaluation: the process
Involving stakeholders in evaluation
Ethics in evaluation
Summary
