Title
Dominance analysis approach for comparing predictors in linear models
Publisher
Mohamed Rashed Ezz Eldin Mohamed
Author
Mohamed Rashed Ezz Eldin Mohamed
Thesis Committee
Researcher / Mohamed Rashed Ezzeldin Mohamed
Supervisor / Elhoussainy Abdel Bar Rady
Supervisor / Ahmed Amin Elsheikh
Examiner / Sayed Meshaal Elsayed
Examiner / Mai Mohamed Kamel
Publication Date
2021
Number of Pages
157 leaves
Language
English
Degree
Doctorate
Specialization
Statistics and Probability
Approval Date
1/11/2021
Place of Approval
Egyptian Universities Libraries Consortium - Applied Statistics and Econometrics
Index
Only 14 pages of 171 are available for public view.

Abstract

Many methods have been recommended to clarify the ambiguous concept of the relative importance of independent variables in multiple regression models. One of the most important approaches for determining the relative importance of predictors is dominance analysis, a technique that determines variable importance based on comparisons of the unique variance contributions of all pairs of variables to regression equations involving all possible subsets of predictors. Relative importance analysis (e.g., dominance analysis) is a very useful supplement to regression analysis. The purpose of determining predictor importance is not model selection but rather uncovering the individual contributions of predictors relative to each other within a selected model. Dominance analysis offers a general framework for determining the relative importance of predictors in univariate and multivariate multiple regression models. This approach relies on pairwise comparisons of the contributions of predictors in all relevant subset models. One of the great weaknesses of dominance analysis (DA) is that it becomes computationally demanding because of the exponentially growing number of submodels involved: for p predictors, there are 2^p − 1 submodels. For instance, with 3 predictors a dominance analysis summary table will have seven possible models: (1) X₁, (2) X₂, (3) X₃, (4) X₁ and X₂, (5) X₂ and X₃, (6) X₁ and X₃, (7) X₁, X₂, and X₃. Likewise, 10 predictors will yield 1,023 possible submodels.
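To make the subset enumeration concrete, the following is a minimal sketch (written in Python with NumPy; it is not taken from the thesis, and the function names r_squared and general_dominance are illustrative) of how general dominance weights can be computed by fitting all 2^p − 1 subset regressions and averaging each predictor's incremental contribution to R².

from itertools import combinations
import numpy as np

def r_squared(X, y):
    # R^2 of an OLS fit of y on the columns of X, with an intercept.
    Xc = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    resid = y - Xc @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

def general_dominance(X, y):
    # Average incremental contribution of each predictor to R^2 over all
    # 2^p - 1 submodels, averaged within each subset size ("general
    # dominance"); the resulting weights sum to the full-model R^2.
    p = X.shape[1]
    r2 = {(): 0.0}  # the empty model explains no variance
    for k in range(1, p + 1):
        for S in combinations(range(p), k):
            r2[S] = r_squared(X[:, list(S)], y)
    weights = np.zeros(p)
    for j in range(p):
        others = [i for i in range(p) if i != j]
        size_means = []
        for k in range(p):  # size of the subset S that predictor j joins
            incs = [r2[tuple(sorted(S + (j,)))] - r2[S]
                    for S in combinations(others, k)]
            size_means.append(np.mean(incs))
        weights[j] = np.mean(size_means)
    return weights

# Hypothetical usage on simulated data with 3 predictors (7 submodels):
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, 0.5, 0.0]) + rng.normal(size=200)
print(general_dominance(X, y))  # one weight per predictor

The nested loop over combinations is exactly where the exponential cost described in the abstract arises: the dictionary r2 must hold one fitted submodel per non-empty subset, so 10 predictors already require 1,023 regressions.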