From 0c872507d84a6193cd769a808e65d34e1514b083 Mon Sep 17 00:00:00 2001
From: Eduardo Valentin
Date: Wed, 29 May 2013 21:37:00 +0000
Subject: [PATCH] thermal: consider emul_temperature while computing trend

When an emulated temperature is in use, relying on the trend reported by
the driver layer can lead to a bogus situation: the debugging user sets a
temperature value, but the trend is still derived from the driver's own
computation. To avoid this, change get_tz_trend() to fall back to the
core's own trend calculation whenever an emulated temperature is in use.

Cc: Zhang Rui
Cc: Amit Daniel Kachhap
Cc: Durgadoss R
Cc: linux-pm@vger.kernel.org
Cc: linux-kernel@vger.kernel.org
Signed-off-by: Eduardo Valentin
Signed-off-by: Zhang Rui
---
 drivers/thermal/thermal_core.c | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/drivers/thermal/thermal_core.c b/drivers/thermal/thermal_core.c
index d755440791b7..c00dc9275fc0 100644
--- a/drivers/thermal/thermal_core.c
+++ b/drivers/thermal/thermal_core.c
@@ -155,7 +155,8 @@ int get_tz_trend(struct thermal_zone_device *tz, int trip)
 {
 	enum thermal_trend trend;
 
-	if (!tz->ops->get_trend || tz->ops->get_trend(tz, trip, &trend)) {
+	if (tz->emul_temperature || !tz->ops->get_trend ||
+	    tz->ops->get_trend(tz, trip, &trend)) {
 		if (tz->temperature > tz->last_temperature)
 			trend = THERMAL_TREND_RAISING;
 		else if (tz->temperature < tz->last_temperature)
--
2.39.5
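
For context, the fallback that this patch now also takes when emulation is
active simply derives the trend from the zone's last two temperature
readings. Below is a minimal standalone C sketch of that fallback logic;
the struct, enum, and function names here are simplified stand-ins chosen
for illustration, not the kernel's actual definitions.

/*
 * Standalone sketch of the trend fallback discussed above.
 * Hypothetical simplified types; not the kernel's thermal structures.
 */
#include <stdio.h>

enum trend { TREND_RAISING, TREND_DROPPING, TREND_STABLE };

struct zone {
	int temperature;       /* current reading (millicelsius) */
	int last_temperature;  /* previous reading (millicelsius) */
	int emul_temperature;  /* non-zero when a debug user injected a value */
};

/*
 * Derive the trend from the zone's own temperature history, as the core
 * does when the driver has no get_trend(), when get_trend() fails, or
 * (after this patch) when an emulated temperature is in use.
 */
static enum trend fallback_trend(const struct zone *tz)
{
	if (tz->temperature > tz->last_temperature)
		return TREND_RAISING;
	if (tz->temperature < tz->last_temperature)
		return TREND_DROPPING;
	return TREND_STABLE;
}

int main(void)
{
	/* Emulation active: the injected value, not the driver, drives the trend. */
	struct zone tz = { .temperature = 45000, .last_temperature = 43000,
			   .emul_temperature = 45000 };

	printf("trend = %d\n", fallback_trend(&tz)); /* prints TREND_RAISING (0) */
	return 0;
}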