Some people think that the Earth is being damaged by human activity and our planet is dying. Others think that human activity makes the Earth a better place to live.
There are probably many opinions about this statement, but in my view human beings are steadily destroying the Earth. Yes, we make it more comfortable for ourselves, but what about nature? Do we ever consider that we damage the Earth when we cut down trees to make paper and furniture, or kill animals for food and fancy clothing? It sounds as if we are saying: "We make the world a better place to live by destroying it."