I was reading one of my many fashion and beauty mags the other day, and they were highlighting trends for the new season. One trend they were touting was white fingernail polish.
I thought about this for a bit and came to the realization that white fingernails are not for me. Even when they're done right, white nails look sickly. When done wrong, they look like you painted them with White Out. I don't think it's a flattering or healthy look, especially on fair-skinned people like me. I, for one, will be skipping this trend.
But what do I know? I'm far from an authority on trends; I just can't see myself falling for white fingernails. Nude nails? Yes. French manicures? Hell no. So what do you think? Would you paint your nails white for the upcoming warmer months? Leave a comment and let me know!