In this work, we propose to automate pre-cancerous tissue abnormality analysis by classifying image patches with a novel two-stage convolutional neural network (CNN) based framework. Rather than training a model on features that may correlate among classes, we train it on features that vary across the different classes. Our framework first processes the input image to locate the regions of interest (glandular structures) and then feeds the processed image to a classification model for abnormality prediction. Our experiments show that the proposed approach improves classification performance by up to 7% when using CNNs and by more than 10% when using texture descriptors. Testing with gland-segmented images further reveals that the performance of our classification approach depends on the quality of gland segmentation, which is a key task in gland structure-guided classification.